[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg. 11728 1726882174.73226: starting run ansible-playbook [core 2.17.4] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-spT executable location = /usr/local/bin/ansible-playbook python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 11728 1726882174.74621: Added group all to inventory 11728 1726882174.74623: Added group ungrouped to inventory 11728 1726882174.74627: Group all now contains ungrouped 11728 1726882174.74630: Examining possible inventory source: /tmp/network-Kc3/inventory.yml 11728 1726882175.00511: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 11728 1726882175.00570: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 11728 1726882175.00600: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 11728 1726882175.00657: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 11728 1726882175.00735: Loaded config def from plugin (inventory/script) 11728 1726882175.00737: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 11728 1726882175.00776: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 11728 1726882175.00870: Loaded config def from plugin (inventory/yaml) 11728 1726882175.00873: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 11728 1726882175.00962: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 11728 1726882175.01446: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 11728 1726882175.01449: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 11728 1726882175.01452: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 11728 1726882175.01457: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 11728 1726882175.01461: Loading data from /tmp/network-Kc3/inventory.yml 11728 1726882175.01551: /tmp/network-Kc3/inventory.yml was not parsable by auto 11728 1726882175.01641: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 11728 1726882175.01679: Loading data from /tmp/network-Kc3/inventory.yml 11728 1726882175.01766: group all already in inventory 11728 1726882175.01773: set inventory_file for managed_node1 11728 1726882175.01776: set inventory_dir for managed_node1 11728 1726882175.01777: Added host managed_node1 to inventory 11728 1726882175.01780: Added host managed_node1 to group all 11728 1726882175.01781: set ansible_host for managed_node1 11728 1726882175.01781: 
set ansible_ssh_extra_args for managed_node1 11728 1726882175.01785: set inventory_file for managed_node2 11728 1726882175.01787: set inventory_dir for managed_node2 11728 1726882175.01788: Added host managed_node2 to inventory 11728 1726882175.01789: Added host managed_node2 to group all 11728 1726882175.01790: set ansible_host for managed_node2 11728 1726882175.01791: set ansible_ssh_extra_args for managed_node2 11728 1726882175.01801: set inventory_file for managed_node3 11728 1726882175.01804: set inventory_dir for managed_node3 11728 1726882175.01805: Added host managed_node3 to inventory 11728 1726882175.01806: Added host managed_node3 to group all 11728 1726882175.01809: set ansible_host for managed_node3 11728 1726882175.01810: set ansible_ssh_extra_args for managed_node3 11728 1726882175.01812: Reconcile groups and hosts in inventory. 11728 1726882175.01816: Group ungrouped now contains managed_node1 11728 1726882175.01818: Group ungrouped now contains managed_node2 11728 1726882175.01820: Group ungrouped now contains managed_node3 11728 1726882175.01898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 11728 1726882175.02029: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 11728 1726882175.02076: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 11728 1726882175.02106: Loaded config def from plugin (vars/host_group_vars) 11728 1726882175.02109: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 11728 1726882175.02116: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 11728 1726882175.02128: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 11728 1726882175.02168: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 11728 1726882175.02507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882175.02607: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 11728 1726882175.02645: Loaded config def from plugin (connection/local) 11728 1726882175.02648: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 11728 1726882175.03322: Loaded config def from plugin (connection/paramiko_ssh) 11728 1726882175.03326: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 11728 1726882175.04233: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 11728 1726882175.04271: Loaded config def from plugin (connection/psrp) 11728 1726882175.04273: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 11728 1726882175.05003: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 11728 1726882175.05039: Loaded config def from plugin (connection/ssh) 11728 1726882175.05042: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 11728 1726882175.07022: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 11728 1726882175.07059: Loaded config def from plugin (connection/winrm) 11728 1726882175.07062: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 11728 1726882175.07102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 11728 1726882175.07162: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 11728 1726882175.07234: Loaded config def from plugin (shell/cmd) 11728 1726882175.07236: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 11728 1726882175.07261: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 11728 1726882175.07335: Loaded config def from plugin (shell/powershell) 11728 1726882175.07338: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 11728 1726882175.07386: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 11728 1726882175.07575: Loaded config def from plugin (shell/sh) 11728 1726882175.07577: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 11728 1726882175.07613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 11728 1726882175.07735: Loaded config def from plugin (become/runas) 11728 1726882175.07742: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 11728 1726882175.07929: Loaded config def from plugin (become/su) 11728 1726882175.07932: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 11728 1726882175.08099: Loaded config def from plugin (become/sudo) 11728 1726882175.08102: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 11728 1726882175.08133: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml 11728 1726882175.08471: in VariableManager get_vars() 11728 1726882175.08496: done with get_vars() 11728 1726882175.08630: trying /usr/local/lib/python3.12/site-packages/ansible/modules 11728 1726882175.11591: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 11728 1726882175.11716: in VariableManager get_vars() 
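The inventory pass above shows the yaml inventory plugin parsing /tmp/network-Kc3/inventory.yml after host_list, script, and auto declined it, adding managed_node1, managed_node2, and managed_node3 to the all and ungrouped groups and setting ansible_host and ansible_ssh_extra_args for each host. A minimal sketch of an inventory file with that shape follows; the host and variable names come from the log, while the addresses and SSH arguments are placeholders because the real values are never printed. (The deprecation warning at the top of the run goes away once the singular ANSIBLE_COLLECTIONS_PATH is exported instead of ANSIBLE_COLLECTIONS_PATHS, as the warning itself advises.)

# Sketch only: shape inferred from the log above, values are placeholders.
all:
  hosts:
    managed_node1:
      ansible_host: 192.0.2.11
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"
    managed_node2:
      ansible_host: 192.0.2.12
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"
    managed_node3:
      ansible_host: 192.0.2.13
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"

Hosts declared only under all fall into the built-in ungrouped group, which matches the "Group ungrouped now contains managed_nodeN" lines above.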
11728 1726882175.11721: done with get_vars() 11728 1726882175.11724: variable 'playbook_dir' from source: magic vars 11728 1726882175.11724: variable 'ansible_playbook_python' from source: magic vars 11728 1726882175.11725: variable 'ansible_config_file' from source: magic vars 11728 1726882175.11726: variable 'groups' from source: magic vars 11728 1726882175.11727: variable 'omit' from source: magic vars 11728 1726882175.11727: variable 'ansible_version' from source: magic vars 11728 1726882175.11728: variable 'ansible_check_mode' from source: magic vars 11728 1726882175.11729: variable 'ansible_diff_mode' from source: magic vars 11728 1726882175.11729: variable 'ansible_forks' from source: magic vars 11728 1726882175.11730: variable 'ansible_inventory_sources' from source: magic vars 11728 1726882175.11731: variable 'ansible_skip_tags' from source: magic vars 11728 1726882175.11731: variable 'ansible_limit' from source: magic vars 11728 1726882175.11732: variable 'ansible_run_tags' from source: magic vars 11728 1726882175.11733: variable 'ansible_verbosity' from source: magic vars 11728 1726882175.11770: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml 11728 1726882175.12392: in VariableManager get_vars() 11728 1726882175.12417: done with get_vars() 11728 1726882175.12554: in VariableManager get_vars() 11728 1726882175.12568: done with get_vars() 11728 1726882175.12622: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 11728 1726882175.12641: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 11728 1726882175.12873: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 11728 1726882175.13037: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 11728 1726882175.13039: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-spT/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 11728 1726882175.13074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 11728 1726882175.13102: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 11728 1726882175.13271: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 11728 1726882175.13338: Loaded config def from plugin (callback/default) 11728 1726882175.13341: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 11728 1726882175.14484: Loaded config def from plugin (callback/junit) 11728 1726882175.14486: Loading CallbackModule 'junit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 11728 1726882175.14535: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 11728 1726882175.14604: Loaded config def from plugin (callback/minimal) 11728 1726882175.14607: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 11728 1726882175.14645: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 11728 1726882175.14709: Loaded config def from plugin (callback/tree) 11728 1726882175.14711: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 11728 1726882175.14841: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 11728 1726882175.14844: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-spT/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. 
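The get_vars() calls above enumerate the magic variables that the VariableManager injects into every play: playbook_dir, ansible_playbook_python, ansible_version, ansible_check_mode, ansible_verbosity, and so on. Purely as an illustration, and not part of the test being run, a throwaway play such as the following could print a few of them; the play and task names here are invented for the example.

# Illustration only; not taken from the collection under test.
- name: Inspect run context via magic variables
  hosts: localhost
  gather_facts: false
  tasks:
    - name: Show a few of the magic variables listed in the log
      ansible.builtin.debug:
        msg: "core {{ ansible_version.full }}, check_mode={{ ansible_check_mode }}, playbook_dir={{ playbook_dir }}"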
PLAYBOOK: tests_bond_options_nm.yml ******************************************** 2 plays in /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml 11728 1726882175.14872: in VariableManager get_vars() 11728 1726882175.14885: done with get_vars() 11728 1726882175.14890: in VariableManager get_vars() 11728 1726882175.14904: done with get_vars() 11728 1726882175.14908: variable 'omit' from source: magic vars 11728 1726882175.14949: in VariableManager get_vars() 11728 1726882175.14963: done with get_vars() 11728 1726882175.14984: variable 'omit' from source: magic vars PLAY [Run playbook 'playbooks/tests_bond_options.yml' with nm as provider] ***** 11728 1726882175.15636: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 11728 1726882175.17623: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 11728 1726882175.17768: getting the remaining hosts for this loop 11728 1726882175.17770: done getting the remaining hosts for this loop 11728 1726882175.17774: getting the next task for host managed_node3 11728 1726882175.17777: done getting next task for host managed_node3 11728 1726882175.17778: ^ task is: TASK: Gathering Facts 11728 1726882175.17780: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882175.17782: getting variables 11728 1726882175.17783: in VariableManager get_vars() 11728 1726882175.17860: Calling all_inventory to load vars for managed_node3 11728 1726882175.17864: Calling groups_inventory to load vars for managed_node3 11728 1726882175.17867: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882175.17880: Calling all_plugins_play to load vars for managed_node3 11728 1726882175.17891: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882175.17971: Calling groups_plugins_play to load vars for managed_node3 11728 1726882175.18016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882175.18070: done with get_vars() 11728 1726882175.18077: done getting variables 11728 1726882175.18249: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=False, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml:6 Friday 20 September 2024 21:29:35 -0400 (0:00:00.035) 0:00:00.035 ****** 11728 1726882175.18269: entering _queue_task() for managed_node3/gather_facts 11728 1726882175.18270: Creating lock for gather_facts 11728 1726882175.18972: worker is 1 (out of 1 available) 11728 1726882175.18983: exiting _queue_task() for managed_node3/gather_facts 11728 1726882175.18999: done queuing things up, now waiting for results queue to drain 11728 1726882175.19001: waiting for pending results... 
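The play name above, Run playbook 'playbooks/tests_bond_options.yml' with nm as provider, indicates that tests_bond_options_nm.yml is a thin wrapper that selects NetworkManager as the provider and then pulls in the shared playbooks/tests_bond_options.yml, which the log loads a few lines earlier. The wrapper file itself is not reproduced in this log, so the sketch below only illustrates that pattern: the hosts pattern, the set_fact task, and the network_provider variable name are assumptions, while the play name and the imported path are taken from the log.

# Illustrative wrapper only; the real file may differ.
- name: Run playbook 'playbooks/tests_bond_options.yml' with nm as provider
  hosts: all
  tasks:
    - name: Select NetworkManager as the provider for the shared tests (assumed variable name)
      ansible.builtin.set_fact:
        network_provider: nm

- import_playbook: playbooks/tests_bond_options.yml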
11728 1726882175.19151: running TaskExecutor() for managed_node3/TASK: Gathering Facts 11728 1726882175.19223: in run() - task 12673a56-9f93-5c28-a762-000000000015 11728 1726882175.19245: variable 'ansible_search_path' from source: unknown 11728 1726882175.19398: calling self._execute() 11728 1726882175.19403: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882175.19406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882175.19409: variable 'omit' from source: magic vars 11728 1726882175.19471: variable 'omit' from source: magic vars 11728 1726882175.19510: variable 'omit' from source: magic vars 11728 1726882175.19553: variable 'omit' from source: magic vars 11728 1726882175.19614: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882175.19658: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882175.19683: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882175.19710: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882175.19732: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882175.19771: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882175.19780: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882175.19788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882175.19899: Set connection var ansible_connection to ssh 11728 1726882175.19936: Set connection var ansible_shell_executable to /bin/sh 11728 1726882175.19939: Set connection var ansible_timeout to 10 11728 1726882175.19941: Set connection var ansible_shell_type to sh 11728 1726882175.19943: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882175.20000: Set connection var ansible_pipelining to False 11728 1726882175.20003: variable 'ansible_shell_executable' from source: unknown 11728 1726882175.20005: variable 'ansible_connection' from source: unknown 11728 1726882175.20007: variable 'ansible_module_compression' from source: unknown 11728 1726882175.20009: variable 'ansible_shell_type' from source: unknown 11728 1726882175.20011: variable 'ansible_shell_executable' from source: unknown 11728 1726882175.20014: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882175.20019: variable 'ansible_pipelining' from source: unknown 11728 1726882175.20027: variable 'ansible_timeout' from source: unknown 11728 1726882175.20035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882175.20255: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (found_in_cache=True, class_only=False) 11728 1726882175.20273: variable 'omit' from source: magic vars 11728 1726882175.20372: starting attempt loop 11728 1726882175.20375: running the handler 11728 1726882175.20377: variable 'ansible_facts' from source: unknown 11728 1726882175.20379: _low_level_execute_command(): starting 11728 1726882175.20381: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882175.21137: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 
1726882175.21158: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882175.21173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882175.21190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882175.21258: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882175.21311: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882175.21335: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882175.21381: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882175.21476: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882175.23113: stdout chunk (state=3): >>>/root <<< 11728 1726882175.23313: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882175.23316: stdout chunk (state=3): >>><<< 11728 1726882175.23318: stderr chunk (state=3): >>><<< 11728 1726882175.23455: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882175.23459: _low_level_execute_command(): starting 11728 1726882175.23462: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882175.2336113-11776-267505380726602 `" && echo ansible-tmp-1726882175.2336113-11776-267505380726602="` echo /root/.ansible/tmp/ansible-tmp-1726882175.2336113-11776-267505380726602 `" ) && sleep 0' 11728 1726882175.24934: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882175.24983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882175.25046: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882175.25055: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882175.25222: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882175.27146: stdout chunk (state=3): >>>ansible-tmp-1726882175.2336113-11776-267505380726602=/root/.ansible/tmp/ansible-tmp-1726882175.2336113-11776-267505380726602 <<< 11728 1726882175.27291: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882175.27298: stdout chunk (state=3): >>><<< 11728 1726882175.27403: stderr chunk (state=3): >>><<< 11728 1726882175.27407: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882175.2336113-11776-267505380726602=/root/.ansible/tmp/ansible-tmp-1726882175.2336113-11776-267505380726602 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882175.27420: variable 'ansible_module_compression' from source: unknown 11728 1726882175.27541: ANSIBALLZ: Using generic lock for ansible.legacy.setup 11728 1726882175.27583: ANSIBALLZ: Acquiring lock 11728 1726882175.27591: ANSIBALLZ: Lock acquired: 139840770723472 11728 1726882175.27605: ANSIBALLZ: Creating module 11728 1726882175.85442: ANSIBALLZ: Writing module into payload 11728 1726882175.85863: ANSIBALLZ: Writing 
module 11728 1726882175.85883: ANSIBALLZ: Renaming module 11728 1726882175.85900: ANSIBALLZ: Done creating module 11728 1726882175.85927: variable 'ansible_facts' from source: unknown 11728 1726882175.86100: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882175.86104: _low_level_execute_command(): starting 11728 1726882175.86106: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 11728 1726882175.87447: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882175.87451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882175.87514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882175.87518: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882175.87973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882175.88061: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882175.89803: stdout chunk (state=3): >>>PLATFORM <<< 11728 1726882175.89809: stdout chunk (state=3): >>>Linux <<< 11728 1726882175.89812: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 11728 1726882175.89956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882175.89960: stdout chunk (state=3): >>><<< 11728 1726882175.89966: stderr chunk (state=3): >>><<< 11728 1726882175.89997: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882175.90005 [managed_node3]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 11728 1726882175.90052: _low_level_execute_command(): starting 11728 1726882175.90058: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 11728 1726882175.90500: Sending initial data 11728 1726882175.90503: Sent initial data (1181 bytes) 11728 1726882175.91886: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882175.91889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882175.91896: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882175.91899: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882175.92000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882175.92009: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882175.92330: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882175.92373: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882175.95787: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 11728 1726882175.96135: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882175.96205: stderr chunk (state=3): >>><<< 11728 1726882175.96221: stdout chunk (state=3): >>><<< 11728 1726882175.96246: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 
(Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882175.96513: variable 'ansible_facts' from source: unknown 11728 1726882175.96596: variable 'ansible_facts' from source: unknown 11728 1726882175.96870: variable 'ansible_module_compression' from source: unknown 11728 1726882175.97208: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 11728 1726882175.97211: variable 'ansible_facts' from source: unknown 11728 1726882175.97556: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882175.2336113-11776-267505380726602/AnsiballZ_setup.py 11728 1726882175.98221: Sending initial data 11728 1726882175.98225: Sent initial data (154 bytes) 11728 1726882175.99732: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882175.99806: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882176.00214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882176.00438: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 11728 1726882176.02048: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882176.02096: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882176.02157: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpxock2x6h /root/.ansible/tmp/ansible-tmp-1726882175.2336113-11776-267505380726602/AnsiballZ_setup.py <<< 11728 1726882176.02191: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882175.2336113-11776-267505380726602/AnsiballZ_setup.py" <<< 11728 1726882176.02299: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpxock2x6h" to remote "/root/.ansible/tmp/ansible-tmp-1726882175.2336113-11776-267505380726602/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882175.2336113-11776-267505380726602/AnsiballZ_setup.py" <<< 11728 1726882176.06159: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882176.06228: stderr chunk (state=3): >>><<< 11728 1726882176.06412: stdout chunk (state=3): >>><<< 11728 1726882176.06415: done transferring module to remote 11728 1726882176.06418: _low_level_execute_command(): starting 11728 1726882176.06421: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882175.2336113-11776-267505380726602/ /root/.ansible/tmp/ansible-tmp-1726882175.2336113-11776-267505380726602/AnsiballZ_setup.py && sleep 0' 11728 1726882176.07972: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882176.07976: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882176.07979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882176.07981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882176.08051: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882176.08060: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882176.08063: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882176.08227: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882176.08288: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882176.10508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882176.10512: stdout chunk (state=3): >>><<< 11728 1726882176.10514: stderr chunk (state=3): >>><<< 11728 1726882176.10632: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882176.10780: _low_level_execute_command(): starting 11728 1726882176.10784: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882175.2336113-11776-267505380726602/AnsiballZ_setup.py && sleep 0' 11728 1726882176.12115: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882176.12167: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882176.12226: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882176.12263: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882176.12330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 
1726882176.15063: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # <<< 11728 1726882176.15067: stdout chunk (state=3): >>>import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook <<< 11728 1726882176.15089: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 11728 1726882176.15109: stdout chunk (state=3): >>>import 'codecs' # <<< 11728 1726882176.15145: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 11728 1726882176.15198: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 11728 1726882176.15211: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff64184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff63e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py <<< 11728 1726882176.15297: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff641aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # <<< 11728 1726882176.15404: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 11728 1726882176.15480: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages <<< 11728 1726882176.15538: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 11728 1726882176.15640: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff61c9130> <<< 11728 1726882176.15646: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff61c9fa0> <<< 11728 1726882176.15670: stdout chunk (state=3): >>>import 'site' # <<< 11728 1726882176.15722: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 11728 1726882176.16050: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 11728 1726882176.16141: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 11728 1726882176.16195: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff6207da0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 11728 1726882176.16232: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff6207fb0> <<< 11728 1726882176.16270: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 11728 1726882176.16573: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff623f770> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff623fe00> import '_collections' # <<< 11728 1726882176.16653: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff621fa40> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff621d160> <<< 11728 1726882176.16694: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff6204f50> <<< 11728 1726882176.16726: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 11728 1726882176.16758: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 11728 1726882176.16867: stdout chunk (state=3): >>>import '_sre' # <<< 11728 1726882176.16901: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py 
# code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 11728 1726882176.17026: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff625f6b0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff625e2d0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff621e030> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff625cb60> <<< 11728 1726882176.17085: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 11728 1726882176.17131: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff62946b0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff62041d0> <<< 11728 1726882176.17255: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff6294b60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff6294a10> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff6294dd0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff6202cf0> <<< 11728 1726882176.17280: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 11728 1726882176.17326: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 11728 1726882176.17378: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 11728 1726882176.17381: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff62954c0> <<< 11728 1726882176.17466: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff6295190> import 'importlib.machinery' # <<< 11728 1726882176.17478: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff62963c0> <<< 11728 1726882176.17570: stdout chunk (state=3): >>>import 
'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 11728 1726882176.17678: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff62b05c0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882176.17813: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff62b1d00> <<< 11728 1726882176.17820: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 11728 1726882176.17831: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff62b2ba0> <<< 11728 1726882176.17932: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff62b3200> <<< 11728 1726882176.17943: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff62b20f0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 11728 1726882176.18023: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff62b3c80> <<< 11728 1726882176.18026: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff62b33b0> <<< 11728 1726882176.18176: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff6296330> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 11728 1726882176.18179: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 11728 1726882176.18357: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7feff5fbbbf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff5fe46e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5fe4440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff5fe4710> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 11728 1726882176.18407: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882176.18596: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882176.18608: stdout chunk (state=3): >>>import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff5fe4fe0> <<< 11728 1726882176.18748: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882176.18765: stdout chunk (state=3): >>>import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff5fe59d0> <<< 11728 1726882176.18827: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5fe4890> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5fb9d90> <<< 11728 1726882176.18842: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 11728 1726882176.18890: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5fe6db0> <<< 11728 1726882176.18933: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5fe5af0> <<< 11728 1726882176.18966: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff6296ae0> <<< 11728 1726882176.19026: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 11728 1726882176.19029: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 11728 
1726882176.19108: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 11728 1726882176.19300: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 11728 1726882176.19325: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff600f110> <<< 11728 1726882176.19461: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff6033470> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 11728 1726882176.19624: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff6094290> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 11728 1726882176.19736: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff60969f0> <<< 11728 1726882176.19800: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff60943b0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff6061280> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff59293d0> <<< 11728 1726882176.19816: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff6032270> <<< 11728 1726882176.19841: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5fe7ce0> <<< 11728 1726882176.20108: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7feff6032870> <<< 11728 1726882176.20338: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_g1g7a_ai/ansible_ansible.legacy.setup_payload.zip' <<< 11728 1726882176.20351: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.20520: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.20530: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 11728 1726882176.20578: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 11728 1726882176.20713: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff598f170> import '_typing' # <<< 11728 1726882176.20980: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff596e060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff596d1f0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 11728 1726882176.23057: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.24538: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 11728 1726882176.24554: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff598d010> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 11728 1726882176.24580: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 11728 1726882176.24612: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 11728 1726882176.24649: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff59beb40> <<< 11728 1726882176.24714: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff59be900> <<< 11728 1726882176.24755: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff59be210> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 11728 1726882176.24758: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 11728 1726882176.24810: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff59be930> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff598fb90> import 
'atexit' # <<< 11728 1726882176.24867: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff59bf890> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff59bfad0> <<< 11728 1726882176.24944: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 11728 1726882176.24953: stdout chunk (state=3): >>>import '_locale' # <<< 11728 1726882176.25016: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff59bffb0> <<< 11728 1726882176.25021: stdout chunk (state=3): >>>import 'pwd' # <<< 11728 1726882176.25054: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 11728 1726882176.25077: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 11728 1726882176.25135: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5829c10> <<< 11728 1726882176.25203: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff582b3e0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 11728 1726882176.25206: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 11728 1726882176.25255: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff582c290> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 11728 1726882176.25299: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 11728 1726882176.25332: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff582d3d0> <<< 11728 1726882176.25392: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 11728 1726882176.25416: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 11728 1726882176.25478: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff582fec0> <<< 11728 1726882176.25650: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from 
'/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff6202de0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff582e180> <<< 11728 1726882176.25667: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 11728 1726882176.25922: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 11728 1726882176.25925: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5837ef0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff58369c0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5836720> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 11728 1726882176.26036: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5836c90> <<< 11728 1726882176.26068: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff582e690> <<< 11728 1726882176.26116: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff587bec0> <<< 11728 1726882176.26136: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff587c0e0> <<< 11728 1726882176.26175: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 11728 1726882176.26222: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 11728 1726882176.26260: stdout chunk (state=3): >>># extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff587dc10> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff587d9d0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 11728 1726882176.26366: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff5880200> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff587e300> <<< 11728 1726882176.26438: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 11728 1726882176.26483: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 11728 1726882176.26524: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc'<<< 11728 1726882176.26564: stdout chunk (state=3): >>> import '_string' # <<< 11728 1726882176.26664: stdout chunk (state=3): >>> import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff58839e0> <<< 11728 1726882176.26818: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff58803b0><<< 11728 1726882176.26829: stdout chunk (state=3): >>> <<< 11728 1726882176.26910: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882176.26926: stdout chunk (state=3): >>> # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882176.26979: stdout chunk (state=3): >>> import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff58847a0> <<< 11728 1726882176.27003: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882176.27134: stdout chunk (state=3): >>> import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff5884a10> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882176.27149: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff5884da0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff587c320> <<< 11728 
1726882176.27199: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py<<< 11728 1726882176.27222: stdout chunk (state=3): >>> <<< 11728 1726882176.27276: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882176.27282: stdout chunk (state=3): >>> <<< 11728 1726882176.27320: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882176.27329: stdout chunk (state=3): >>> <<< 11728 1726882176.27547: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff57103e0> <<< 11728 1726882176.27590: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882176.27617: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882176.27630: stdout chunk (state=3): >>> <<< 11728 1726882176.27641: stdout chunk (state=3): >>>import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff57113d0> <<< 11728 1726882176.27672: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5886b70> <<< 11728 1726882176.27716: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882176.27731: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882176.27743: stdout chunk (state=3): >>> <<< 11728 1726882176.27757: stdout chunk (state=3): >>>import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff5887f20> <<< 11728 1726882176.27780: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff58867b0> <<< 11728 1726882176.27810: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.27841: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882176.27846: stdout chunk (state=3): >>> <<< 11728 1726882176.27871: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # <<< 11728 1726882176.27911: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882176.27916: stdout chunk (state=3): >>> <<< 11728 1726882176.28049: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882176.28053: stdout chunk (state=3): >>> <<< 11728 1726882176.28194: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882176.28204: stdout chunk (state=3): >>> <<< 11728 1726882176.28233: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 11728 1726882176.28256: stdout chunk (state=3): >>> # zipimport: zlib available<<< 11728 1726882176.28283: stdout chunk (state=3): >>> # zipimport: 
zlib available<<< 11728 1726882176.28297: stdout chunk (state=3): >>> import 'ansible.module_utils.common.text' # <<< 11728 1726882176.28325: stdout chunk (state=3): >>> # zipimport: zlib available<<< 11728 1726882176.28444: stdout chunk (state=3): >>> <<< 11728 1726882176.28531: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882176.28537: stdout chunk (state=3): >>> <<< 11728 1726882176.28730: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882176.28737: stdout chunk (state=3): >>> <<< 11728 1726882176.29650: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.30543: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 11728 1726882176.30571: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 11728 1726882176.30576: stdout chunk (state=3): >>> <<< 11728 1726882176.30608: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # <<< 11728 1726882176.30624: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 11728 1726882176.30665: stdout chunk (state=3): >>> # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py<<< 11728 1726882176.30670: stdout chunk (state=3): >>> <<< 11728 1726882176.30700: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc'<<< 11728 1726882176.30774: stdout chunk (state=3): >>> # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882176.30786: stdout chunk (state=3): >>> <<< 11728 1726882176.30806: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882176.30819: stdout chunk (state=3): >>>import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff57156d0> <<< 11728 1726882176.30939: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py<<< 11728 1726882176.32477: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5716540> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5711610> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 11728 1726882176.32483: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5716c30> # zipimport: zlib available # zipimport: zlib available <<< 11728 1726882176.33050: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.33056: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.33152: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 11728 1726882176.33165: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.33210: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.33258: stdout chunk (state=3): 
>>>import 'ansible.module_utils.common.warnings' # <<< 11728 1726882176.33264: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.33367: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.33486: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 11728 1726882176.33521: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.33524: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.33526: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # <<< 11728 1726882176.33645: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 11728 1726882176.33996: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.34359: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 11728 1726882176.34444: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 11728 1726882176.34456: stdout chunk (state=3): >>>import '_ast' # <<< 11728 1726882176.34558: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff57176e0> <<< 11728 1726882176.34562: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.34670: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.34771: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 11728 1726882176.34782: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 11728 1726882176.34787: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 11728 1726882176.34814: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.35048: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 11728 1726882176.35108: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.35204: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 11728 1726882176.35256: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 11728 1726882176.35366: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff57222d0> <<< 11728 1726882176.35418: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff571d0a0> <<< 11728 1726882176.35464: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 11728 1726882176.35468: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 11728 1726882176.35564: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.35646: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 11728 1726882176.35683: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.35737: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 11728 1726882176.35742: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 11728 1726882176.35758: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 11728 1726882176.35795: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 11728 1726882176.35808: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 11728 1726882176.35892: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 11728 1726882176.36153: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff580ac30> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff58fe900> <<< 11728 1726882176.36172: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff57224e0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5722060> # destroy ansible.module_utils.distro <<< 11728 1726882176.36179: stdout chunk (state=3): >>>import 'ansible.module_utils.distro' # <<< 11728 1726882176.36186: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.36222: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.36255: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 11728 1726882176.36334: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 11728 1726882176.36352: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.36363: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.36375: stdout chunk (state=3): >>>import 'ansible.modules' # <<< 11728 1726882176.36380: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.36464: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.36548: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.36570: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.36597: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.36651: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.36706: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.36754: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.36797: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 11728 1726882176.36812: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.36915: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.37022: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 11728 1726882176.37053: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882176.37058: stdout chunk (state=3): >>> <<< 11728 1726882176.37096: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 11728 1726882176.37110: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.37373: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.37852: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff57b6360> <<< 11728 1726882176.37883: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 11728 1726882176.37896: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 11728 1726882176.37922: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 11728 1726882176.37971: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 11728 1726882176.38000: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 11728 1726882176.38019: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 11728 1726882176.38029: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff53e42f0> <<< 11728 1726882176.38068: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882176.38081: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882176.38087: stdout chunk (state=3): >>>import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff53e4560> <<< 11728 1726882176.38151: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff579c8f0> <<< 11728 1726882176.38172: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff57b6f00> <<< 11728 1726882176.38207: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff57b4a40> <<< 11728 1726882176.38226: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff57b5430> <<< 11728 
1726882176.38237: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 11728 1726882176.38303: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 11728 1726882176.38323: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 11728 1726882176.38336: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 11728 1726882176.38358: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 11728 1726882176.38381: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 11728 1726882176.38403: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882176.38406: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff53e7650> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff53e6f00> <<< 11728 1726882176.38654: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff53e70e0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff53e6330> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff53e7830> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 11728 1726882176.38701: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 11728 1726882176.38727: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882176.38733: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff5446300> <<< 11728 1726882176.38769: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5444350> <<< 11728 1726882176.38808: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff57b47a0> <<< 11728 1726882176.38821: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # <<< 11728 1726882176.38837: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 11728 1726882176.38841: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 11728 1726882176.38866: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.38869: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other' # <<< 11728 1726882176.38872: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.38955: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.39026: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 11728 1726882176.39049: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.39110: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.39175: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 11728 1726882176.39190: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.39207: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 11728 1726882176.39354: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 11728 1726882176.39369: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.39437: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 11728 1726882176.39498: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.39549: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 11728 1726882176.39566: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.39637: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.39719: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.39800: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.39872: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 11728 1726882176.39895: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.40649: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.41311: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 11728 1726882176.41445: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11728 1726882176.41467: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.41509: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.41556: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 11728 1726882176.41569: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.41610: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.41660: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 11728 1726882176.41663: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.41739: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.41810: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 11728 1726882176.41830: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.41873: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.41899: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.system.fips' # <<< 11728 1726882176.41964: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11728 1726882176.41997: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 11728 1726882176.42011: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.42104: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.42366: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff54478c0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 11728 1726882176.42484: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5446f30> import 'ansible.module_utils.facts.system.local' # <<< 11728 1726882176.42490: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.42586: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.42671: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 11728 1726882176.42691: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.42815: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.42948: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 11728 1726882176.42954: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.43047: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.43140: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 11728 1726882176.43161: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.43208: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.43272: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 11728 1726882176.43345: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 11728 1726882176.43411: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882176.43548: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff5486510> <<< 11728 1726882176.43785: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff54762d0> import 'ansible.module_utils.facts.system.python' # <<< 11728 1726882176.43804: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.43882: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.43952: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 11728 1726882176.44144: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11728 1726882176.44203: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.44368: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 11728 1726882176.44574: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 11728 1726882176.44586: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # <<< 11728 1726882176.44595: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.44643: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.44697: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 11728 1726882176.44703: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.44756: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.44818: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 11728 1726882176.44824: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 11728 1726882176.44864: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882176.44885: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff5499a90> <<< 11728 1726882176.44888: stdout chunk (state=3): >>>import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5484620> import 'ansible.module_utils.facts.system.user' # <<< 11728 1726882176.44915: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11728 1726882176.44925: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware' # <<< 11728 1726882176.44938: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.44992: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.45418: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available <<< 11728 1726882176.45540: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 11728 1726882176.45543: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.45673: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.45822: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.45880: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.45936: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 11728 1726882176.45947: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.45996: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.46016: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.46214: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.46440: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 11728 1726882176.46459: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.46618: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.46807: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 11728 1726882176.46828: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 11728 1726882176.47055: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11728 1726882176.47725: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.48518: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 11728 1726882176.48525: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # <<< 11728 1726882176.48644: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.48682: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.48840: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 11728 1726882176.48848: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.48979: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.49126: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 11728 1726882176.49133: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.49359: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.49587: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 11728 1726882176.49612: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11728 1726882176.49618: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network' # <<< 11728 1726882176.49854: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 11728 1726882176.49895: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.50041: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.50345: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.50648: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 11728 1726882176.50656: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # <<< 11728 1726882176.50662: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.50711: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.50753: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 11728 1726882176.50771: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.50799: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.50827: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 11728 1726882176.50832: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.50931: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.51028: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 11728 1726882176.51036: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.51063: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.51101: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 11728 1726882176.51109: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.51241: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.51312: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 11728 1726882176.51334: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.51401: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.51474: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 11728 1726882176.51559: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.51892: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.52304: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 11728 1726882176.52308: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.52389: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.52466: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 11728 1726882176.52544: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11728 1726882176.52564: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 11728 1726882176.52570: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.52618: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.52655: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 11728 1726882176.52670: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.52711: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.52755: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 11728 1726882176.52765: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.52874: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.52980: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 11728 1726882176.53004: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.53014: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.53026: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # <<< 11728 1726882176.53033: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.53096: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.53256: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 11728 1726882176.53288: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.53332: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.53454: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.53559: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 11728 1726882176.53562: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 11728 1726882176.53587: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.53640: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.53756: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 11728 1726882176.53997: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.54300: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 11728 1726882176.54555: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: 
zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available <<< 11728 1726882176.54563: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 11728 1726882176.54572: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.54694: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.54816: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 11728 1726882176.54821: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # <<< 11728 1726882176.54823: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.54944: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.55077: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 11728 1726882176.55083: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 11728 1726882176.55183: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882176.56190: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 11728 1726882176.56196: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 11728 1726882176.56199: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 11728 1726882176.56214: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 11728 1726882176.56242: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882176.56254: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff529e570> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff529f050> <<< 11728 1726882176.56310: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5295130> <<< 11728 1726882176.67227: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 11728 1726882176.67231: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' <<< 11728 1726882176.67278: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff52e62a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 11728 1726882176.67282: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 11728 1726882176.67306: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff52e4dd0> <<< 11728 1726882176.67361: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 11728 1726882176.67392: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 11728 1726882176.67410: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff52e6690> <<< 11728 1726882176.67443: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff52e5d60> <<< 11728 1726882176.67701: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 11728 1726882176.94114: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_is_chroot": false, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "29", "second": "36", "epoch": "1726882176", "epoch_int": "1726882176", "date": "2024-09-20", "time": "21:29:36", "iso8601_micro": "2024-09-21T01:29:36.553099Z", "iso8601": "2024-09-21T01:29:36Z", "iso8601_basic": "20240920T212936553099", "iso8601_basic_short": "20240920T212936", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": 
"ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2971, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 560, "free": 2971}, "nocache": {"free": 3285, "used": 246}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_uuid": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": 
"250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 483, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805363200, "block_size": 4096, "block_total": 65519099, "block_available": 63917325, "block_used": 1601774, "inode_total": 131070960, "inode_available": 131029134, "inode_used": 41826, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_loadavg": {"1m": 0.2197265625, "5m": 0.18896484375, "15m": 0.1083984375}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", 
"tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1087:27ff:fe91:8737", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off 
[fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.229"], "ansible_all_ipv6_addresses": ["fe80::1087:27ff:fe91:8737"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.229", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1087:27ff:fe91:8737"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 11728 1726882176.94648: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 11728 1726882176.94698: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path<<< 11728 1726882176.94753: stdout chunk (state=3): >>> # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value<<< 11728 1726882176.94757: stdout chunk (state=3): >>> # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys<<< 11728 1726882176.94866: stdout chunk (state=3): >>> # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external<<< 11728 1726882176.94870: stdout chunk (state=3): >>> # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc<<< 11728 1726882176.94872: stdout chunk (state=3): >>> # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack<<< 11728 1726882176.94911: stdout chunk (state=3): >>> # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib<<< 11728 1726882176.94933: stdout chunk (state=3): >>> # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing 
_sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset<<< 11728 1726882176.94953: stdout chunk (state=3): >>> # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse<<< 11728 1726882176.94976: stdout chunk (state=3): >>> # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing<<< 11728 1726882176.95000: stdout chunk (state=3): >>> # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json<<< 11728 1726882176.95059: stdout chunk (state=3): >>> # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd<<< 11728 1726882176.95228: stdout chunk (state=3): >>> # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog<<< 11728 1726882176.95239: stdout chunk (state=3): >>> # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal 
# cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file <<< 11728 1726882176.95267: stdout chunk (state=3): >>># destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils<<< 11728 1726882176.95309: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction<<< 11728 1726882176.95358: stdout chunk 
(state=3): >>> # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env<<< 11728 1726882176.95389: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version<<< 11728 1726882176.95427: stdout chunk (state=3): >>> # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd<<< 11728 1726882176.95446: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing 
ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd<<< 11728 1726882176.95477: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd<<< 11728 1726882176.95531: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter<<< 11728 1726882176.95537: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb<<< 11728 1726882176.95564: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr<<< 11728 1726882176.95625: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy 
ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme<<< 11728 1726882176.95647: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize <<< 11728 1726882176.95777: stdout chunk (state=3): >>># cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 11728 1726882176.96194: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 11728 1726882176.96209: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc <<< 11728 1726882176.96232: stdout chunk (state=3): >>># destroy importlib.util <<< 11728 1726882176.96289: stdout chunk (state=3): >>># destroy _bz2 <<< 11728 1726882176.96319: stdout chunk (state=3): >>># destroy _compression # destroy _lzma<<< 11728 1726882176.96322: stdout chunk (state=3): >>> <<< 11728 1726882176.96344: stdout chunk (state=3): >>># destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path<<< 11728 1726882176.96374: stdout chunk (state=3): >>> # destroy zipfile<<< 11728 1726882176.96382: stdout chunk (state=3): >>> <<< 11728 1726882176.96396: stdout chunk (state=3): >>># destroy pathlib # destroy zipfile._path.glob # destroy ipaddress<<< 11728 1726882176.96439: stdout chunk (state=3): >>> # destroy ntpath <<< 11728 1726882176.96472: stdout chunk (state=3): >>># destroy importlib <<< 11728 1726882176.96487: stdout chunk (state=3): >>># destroy zipimport<<< 11728 1726882176.96497: stdout chunk (state=3): >>> # destroy __main__<<< 11728 1726882176.96521: stdout chunk (state=3): >>> # destroy 
systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder<<< 11728 1726882176.96529: stdout chunk (state=3): >>> # destroy json.encoder<<< 11728 1726882176.96546: stdout chunk (state=3): >>> # destroy json.scanner # destroy _json<<< 11728 1726882176.96572: stdout chunk (state=3): >>> # destroy grp # destroy encodings # destroy _locale<<< 11728 1726882176.96578: stdout chunk (state=3): >>> <<< 11728 1726882176.96598: stdout chunk (state=3): >>># destroy locale # destroy select # destroy _signal <<< 11728 1726882176.96624: stdout chunk (state=3): >>># destroy _posixsubprocess # destroy syslog<<< 11728 1726882176.96647: stdout chunk (state=3): >>> # destroy uuid <<< 11728 1726882176.96712: stdout chunk (state=3): >>># destroy selinux # destroy shutil # destroy distro <<< 11728 1726882176.96747: stdout chunk (state=3): >>># destroy distro.distro # destroy argparse # destroy logging <<< 11728 1726882176.96787: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors<<< 11728 1726882176.96806: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing<<< 11728 1726882176.96817: stdout chunk (state=3): >>> # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal<<< 11728 1726882176.96844: stdout chunk (state=3): >>> # destroy pickle<<< 11728 1726882176.96870: stdout chunk (state=3): >>> # destroy _compat_pickle # destroy _pickle<<< 11728 1726882176.96878: stdout chunk (state=3): >>> <<< 11728 1726882176.96909: stdout chunk (state=3): >>># destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction<<< 11728 1726882176.96914: stdout chunk (state=3): >>> <<< 11728 1726882176.96957: stdout chunk (state=3): >>># destroy selectors # destroy shlex<<< 11728 1726882176.96960: stdout chunk (state=3): >>> # destroy fcntl<<< 11728 1726882176.96987: stdout chunk (state=3): >>> # destroy datetime<<< 11728 1726882176.97001: stdout chunk (state=3): >>> # destroy subprocess<<< 11728 1726882176.97017: stdout chunk (state=3): >>> # destroy base64<<< 11728 1726882176.97048: stdout chunk (state=3): >>> # destroy _ssl<<< 11728 1726882176.97069: stdout chunk (state=3): >>> <<< 11728 1726882176.97089: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux<<< 11728 1726882176.97133: stdout chunk (state=3): >>> # destroy getpass # destroy pwd # destroy termios # destroy json<<< 11728 1726882176.97162: stdout chunk (state=3): >>> # destroy socket # destroy struct<<< 11728 1726882176.97201: stdout chunk (state=3): >>> # destroy glob # destroy fnmatch<<< 11728 1726882176.97229: stdout chunk (state=3): >>> # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile<<< 11728 1726882176.97247: stdout chunk (state=3): >>> # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array<<< 11728 1726882176.97500: stdout chunk (state=3): >>> # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping 
ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 11728 1726882176.97503: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib <<< 11728 1726882176.97508: stdout chunk (state=3): >>># cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 11728 1726882176.97830: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 11728 1726882176.98000: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy 
warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 11728 1726882176.98061: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools <<< 11728 1726882176.98265: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 11728 1726882176.98604: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882176.98607: stderr chunk (state=3): >>>Shared connection to 10.31.10.229 closed. <<< 11728 1726882176.98800: stderr chunk (state=3): >>><<< 11728 1726882176.98803: stdout chunk (state=3): >>><<< 11728 1726882176.98917: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff64184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff63e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff641aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff61c9130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff61c9fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff6207da0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff6207fb0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff623f770> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff623fe00> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff621fa40> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff621d160> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff6204f50> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff625f6b0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff625e2d0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff621e030> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff625cb60> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff62946b0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff62041d0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff6294b60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff6294a10> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff6294dd0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff6202cf0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff62954c0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff6295190> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff62963c0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff62b05c0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff62b1d00> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7feff62b2ba0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff62b3200> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff62b20f0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff62b3c80> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff62b33b0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff6296330> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff5fbbbf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff5fe46e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5fe4440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff5fe4710> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff5fe4fe0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff5fe59d0> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7feff5fe4890> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5fb9d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5fe6db0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5fe5af0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff6296ae0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff600f110> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff6033470> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff6094290> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff60969f0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff60943b0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff6061280> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff59293d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff6032270> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5fe7ce0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7feff6032870> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_g1g7a_ai/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff598f170> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff596e060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff596d1f0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff598d010> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff59beb40> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff59be900> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff59be210> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff59be930> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff598fb90> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff59bf890> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 
'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff59bfad0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff59bffb0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5829c10> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff582b3e0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff582c290> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff582d3d0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff582fec0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff6202de0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff582e180> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5837ef0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff58369c0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 
0x7feff5836720> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5836c90> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff582e690> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff587bec0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff587c0e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff587dc10> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff587d9d0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff5880200> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff587e300> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff58839e0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff58803b0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff58847a0> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff5884a10> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff5884da0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff587c320> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff57103e0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff57113d0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5886b70> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff5887f20> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff58867b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7feff57156d0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5716540> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5711610> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5716c30> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff57176e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff57222d0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff571d0a0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff580ac30> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff58fe900> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff57224e0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5722060> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff57b6360> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff53e42f0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7feff53e4560> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff579c8f0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff57b6f00> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff57b4a40> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff57b5430> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff53e7650> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff53e6f00> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff53e70e0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff53e6330> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff53e7830> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff5446300> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5444350> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff57b47a0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff54478c0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5446f30> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff5486510> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff54762d0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff5499a90> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5484620> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feff529e570> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff529f050> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff5295130> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff52e62a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff52e4dd0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from 
'/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff52e6690> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7feff52e5d60> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_is_chroot": false, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "29", "second": "36", "epoch": "1726882176", "epoch_int": "1726882176", "date": "2024-09-20", "time": "21:29:36", "iso8601_micro": "2024-09-21T01:29:36.553099Z", "iso8601": "2024-09-21T01:29:36Z", "iso8601_basic": "20240920T212936553099", "iso8601_basic_short": "20240920T212936", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_apparmor": {"status": "disabled"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": 
{"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2971, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 560, "free": 2971}, "nocache": {"free": 3285, "used": 246}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_uuid": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 483, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805363200, "block_size": 4096, "block_total": 65519099, "block_available": 63917325, "block_used": 1601774, "inode_total": 131070960, "inode_available": 
131029134, "inode_used": 41826, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_loadavg": {"1m": 0.2197265625, "5m": 0.18896484375, "15m": 0.1083984375}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", 
"rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1087:27ff:fe91:8737", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": 
"12:87:27:91:87:37", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.229"], "ansible_all_ipv6_addresses": ["fe80::1087:27ff:fe91:8737"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.229", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1087:27ff:fe91:8737"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] 
removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # 
cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob 
# cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # 
destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] 
removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping 
_functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. [WARNING]: Module invocation had junk after the JSON data: [Python interpreter shutdown trace omitted here; it is a verbatim duplicate of the "# clear ... # cleanup ... # destroy ..." teardown output already shown after the module's JSON result above] [WARNING]: Platform linux on host managed_node3 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
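Annotation: a minimal inventory sketch of one common way to silence the interpreter-discovery warning above by pinning ansible_python_interpreter explicitly. The host name and address are taken from this run; the surrounding inventory layout is assumed for illustration and is not the actual /tmp/network-Kc3/inventory.yml used here.

    all:
      hosts:
        managed_node3:
          ansible_host: 10.31.10.229
          # Pin the interpreter so later Python installs cannot change which one Ansible uses
          ansible_python_interpreter: /usr/bin/python3.12
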
11728 1726882177.00774: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882175.2336113-11776-267505380726602/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882177.00777: _low_level_execute_command(): starting 11728 1726882177.00779: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882175.2336113-11776-267505380726602/ > /dev/null 2>&1 && sleep 0' 11728 1726882177.00781: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882177.00783: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882177.00785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882177.00787: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882177.00789: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882177.00791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882177.00796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11728 1726882177.02903: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882177.02907: stdout chunk (state=3): >>><<< 11728 1726882177.02909: stderr chunk (state=3): >>><<< 11728 1726882177.03022: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11728 1726882177.03030: handler run complete 11728 1726882177.03200: variable 'ansible_facts' from source: unknown 11728 1726882177.03500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882177.04541: variable 'ansible_facts' from source: unknown 11728 1726882177.04759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882177.04960: attempt loop complete, returning result 11728 1726882177.04963: _execute() done 11728 1726882177.04966: dumping result to json 11728 1726882177.04999: done dumping result, returning 11728 1726882177.05197: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [12673a56-9f93-5c28-a762-000000000015] 11728 1726882177.05201: sending task result for task 12673a56-9f93-5c28-a762-000000000015 11728 1726882177.05991: done sending task result for task 12673a56-9f93-5c28-a762-000000000015 11728 1726882177.05998: WORKER PROCESS EXITING ok: [managed_node3] 11728 1726882177.06515: no more pending results, returning what we have 11728 1726882177.06518: results queue empty 11728 1726882177.06519: checking for any_errors_fatal 11728 1726882177.06697: done checking for any_errors_fatal 11728 1726882177.06699: checking for max_fail_percentage 11728 1726882177.06700: done checking for max_fail_percentage 11728 1726882177.06701: checking to see if all hosts have failed and the running result is not ok 11728 1726882177.06702: done checking to see if all hosts have failed 11728 1726882177.06703: getting the remaining hosts for this loop 11728 1726882177.06704: done getting the remaining hosts for this loop 11728 1726882177.06709: getting the next task for host managed_node3 11728 1726882177.06715: done getting next task for host managed_node3 11728 1726882177.06717: ^ task is: TASK: meta (flush_handlers) 11728 1726882177.06719: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882177.06723: getting variables 11728 1726882177.06727: in VariableManager get_vars() 11728 1726882177.06811: Calling all_inventory to load vars for managed_node3 11728 1726882177.06814: Calling groups_inventory to load vars for managed_node3 11728 1726882177.06817: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882177.06828: Calling all_plugins_play to load vars for managed_node3 11728 1726882177.06830: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882177.06833: Calling groups_plugins_play to load vars for managed_node3 11728 1726882177.07268: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882177.08087: done with get_vars() 11728 1726882177.08205: done getting variables 11728 1726882177.08269: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ 11728 1726882177.08465: in VariableManager get_vars() 11728 1726882177.08475: Calling all_inventory to load vars for managed_node3 11728 1726882177.08478: Calling groups_inventory to load vars for managed_node3 11728 1726882177.08480: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882177.08485: Calling all_plugins_play to load vars for managed_node3 11728 1726882177.08487: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882177.08490: Calling groups_plugins_play to load vars for managed_node3 11728 1726882177.09102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882177.09555: done with get_vars() 11728 1726882177.09569: done queuing things up, now waiting for results queue to drain 11728 1726882177.09572: results queue empty 11728 1726882177.09572: checking for any_errors_fatal 11728 1726882177.09575: done checking for any_errors_fatal 11728 1726882177.09581: checking for max_fail_percentage 11728 1726882177.09582: done checking for max_fail_percentage 11728 1726882177.09583: checking to see if all hosts have failed and the running result is not ok 11728 1726882177.09584: done checking to see if all hosts have failed 11728 1726882177.09584: getting the remaining hosts for this loop 11728 1726882177.09585: done getting the remaining hosts for this loop 11728 1726882177.09588: getting the next task for host managed_node3 11728 1726882177.09592: done getting next task for host managed_node3 11728 1726882177.09599: ^ task is: TASK: Include the task 'el_repo_setup.yml' 11728 1726882177.09600: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882177.09603: getting variables 11728 1726882177.09604: in VariableManager get_vars() 11728 1726882177.09612: Calling all_inventory to load vars for managed_node3 11728 1726882177.09614: Calling groups_inventory to load vars for managed_node3 11728 1726882177.09616: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882177.09699: Calling all_plugins_play to load vars for managed_node3 11728 1726882177.09702: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882177.09706: Calling groups_plugins_play to load vars for managed_node3 11728 1726882177.09960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882177.10344: done with get_vars() 11728 1726882177.10351: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml:11 Friday 20 September 2024 21:29:37 -0400 (0:00:01.922) 0:00:01.957 ****** 11728 1726882177.10549: entering _queue_task() for managed_node3/include_tasks 11728 1726882177.10551: Creating lock for include_tasks 11728 1726882177.11844: worker is 1 (out of 1 available) 11728 1726882177.11855: exiting _queue_task() for managed_node3/include_tasks 11728 1726882177.11865: done queuing things up, now waiting for results queue to drain 11728 1726882177.11866: waiting for pending results... 11728 1726882177.12413: running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' 11728 1726882177.12419: in run() - task 12673a56-9f93-5c28-a762-000000000006 11728 1726882177.12424: variable 'ansible_search_path' from source: unknown 11728 1726882177.12584: calling self._execute() 11728 1726882177.12666: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882177.12737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882177.12752: variable 'omit' from source: magic vars 11728 1726882177.12974: _execute() done 11728 1726882177.12983: dumping result to json 11728 1726882177.12990: done dumping result, returning 11728 1726882177.13007: done running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' [12673a56-9f93-5c28-a762-000000000006] 11728 1726882177.13062: sending task result for task 12673a56-9f93-5c28-a762-000000000006 11728 1726882177.13350: no more pending results, returning what we have 11728 1726882177.13355: in VariableManager get_vars() 11728 1726882177.13388: Calling all_inventory to load vars for managed_node3 11728 1726882177.13391: Calling groups_inventory to load vars for managed_node3 11728 1726882177.13501: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882177.13520: Calling all_plugins_play to load vars for managed_node3 11728 1726882177.13524: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882177.13528: Calling groups_plugins_play to load vars for managed_node3 11728 1726882177.14031: done sending task result for task 12673a56-9f93-5c28-a762-000000000006 11728 1726882177.14035: WORKER PROCESS EXITING 11728 1726882177.14172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882177.14475: done with get_vars() 11728 1726882177.14483: variable 'ansible_search_path' from source: unknown 11728 1726882177.14635: we have included files to process 11728 1726882177.14636: 
generating all_blocks data 11728 1726882177.14638: done generating all_blocks data 11728 1726882177.14639: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 11728 1726882177.14640: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 11728 1726882177.14643: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 11728 1726882177.16336: in VariableManager get_vars() 11728 1726882177.16353: done with get_vars() 11728 1726882177.16365: done processing included file 11728 1726882177.16367: iterating over new_blocks loaded from include file 11728 1726882177.16369: in VariableManager get_vars() 11728 1726882177.16378: done with get_vars() 11728 1726882177.16380: filtering new block on tags 11728 1726882177.16712: done filtering new block on tags 11728 1726882177.16716: in VariableManager get_vars() 11728 1726882177.16727: done with get_vars() 11728 1726882177.16729: filtering new block on tags 11728 1726882177.16745: done filtering new block on tags 11728 1726882177.16747: in VariableManager get_vars() 11728 1726882177.16758: done with get_vars() 11728 1726882177.16759: filtering new block on tags 11728 1726882177.16772: done filtering new block on tags 11728 1726882177.16773: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node3 11728 1726882177.16780: extending task lists for all hosts with included blocks 11728 1726882177.16854: done extending task lists 11728 1726882177.16855: done processing included files 11728 1726882177.16856: results queue empty 11728 1726882177.16857: checking for any_errors_fatal 11728 1726882177.16858: done checking for any_errors_fatal 11728 1726882177.16859: checking for max_fail_percentage 11728 1726882177.16860: done checking for max_fail_percentage 11728 1726882177.16860: checking to see if all hosts have failed and the running result is not ok 11728 1726882177.16861: done checking to see if all hosts have failed 11728 1726882177.16862: getting the remaining hosts for this loop 11728 1726882177.16863: done getting the remaining hosts for this loop 11728 1726882177.16865: getting the next task for host managed_node3 11728 1726882177.16869: done getting next task for host managed_node3 11728 1726882177.16870: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 11728 1726882177.16873: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882177.16875: getting variables 11728 1726882177.16876: in VariableManager get_vars() 11728 1726882177.16885: Calling all_inventory to load vars for managed_node3 11728 1726882177.16887: Calling groups_inventory to load vars for managed_node3 11728 1726882177.16889: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882177.17045: Calling all_plugins_play to load vars for managed_node3 11728 1726882177.17049: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882177.17052: Calling groups_plugins_play to load vars for managed_node3 11728 1726882177.17342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882177.17780: done with get_vars() 11728 1726882177.17789: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 21:29:37 -0400 (0:00:00.074) 0:00:02.032 ****** 11728 1726882177.17964: entering _queue_task() for managed_node3/setup 11728 1726882177.18805: worker is 1 (out of 1 available) 11728 1726882177.18815: exiting _queue_task() for managed_node3/setup 11728 1726882177.18828: done queuing things up, now waiting for results queue to drain 11728 1726882177.18829: waiting for pending results... 11728 1726882177.19133: running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 11728 1726882177.19238: in run() - task 12673a56-9f93-5c28-a762-000000000026 11728 1726882177.19323: variable 'ansible_search_path' from source: unknown 11728 1726882177.19331: variable 'ansible_search_path' from source: unknown 11728 1726882177.19368: calling self._execute() 11728 1726882177.19539: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882177.19703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882177.19707: variable 'omit' from source: magic vars 11728 1726882177.20804: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882177.25267: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882177.25448: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882177.25544: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882177.25657: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882177.25732: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882177.25924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882177.26043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882177.26151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 11728 1726882177.26204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882177.26231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882177.26477: variable 'ansible_facts' from source: unknown 11728 1726882177.26578: variable 'network_test_required_facts' from source: task vars 11728 1726882177.26624: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 11728 1726882177.26635: variable 'omit' from source: magic vars 11728 1726882177.26681: variable 'omit' from source: magic vars 11728 1726882177.26719: variable 'omit' from source: magic vars 11728 1726882177.26747: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882177.26786: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882177.26810: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882177.26832: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882177.26880: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882177.26883: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882177.26891: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882177.26901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882177.26998: Set connection var ansible_connection to ssh 11728 1726882177.27016: Set connection var ansible_shell_executable to /bin/sh 11728 1726882177.27098: Set connection var ansible_timeout to 10 11728 1726882177.27101: Set connection var ansible_shell_type to sh 11728 1726882177.27104: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882177.27106: Set connection var ansible_pipelining to False 11728 1726882177.27108: variable 'ansible_shell_executable' from source: unknown 11728 1726882177.27110: variable 'ansible_connection' from source: unknown 11728 1726882177.27112: variable 'ansible_module_compression' from source: unknown 11728 1726882177.27114: variable 'ansible_shell_type' from source: unknown 11728 1726882177.27116: variable 'ansible_shell_executable' from source: unknown 11728 1726882177.27118: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882177.27119: variable 'ansible_pipelining' from source: unknown 11728 1726882177.27124: variable 'ansible_timeout' from source: unknown 11728 1726882177.27127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882177.27259: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11728 1726882177.27276: variable 'omit' from source: magic vars 11728 1726882177.27286: starting attempt loop 11728 
1726882177.27292: running the handler 11728 1726882177.27321: _low_level_execute_command(): starting 11728 1726882177.27334: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882177.28276: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882177.28402: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882177.28656: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882177.28699: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882177.30295: stdout chunk (state=3): >>>/root <<< 11728 1726882177.30467: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882177.30491: stdout chunk (state=3): >>><<< 11728 1726882177.30499: stderr chunk (state=3): >>><<< 11728 1726882177.30755: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882177.30767: _low_level_execute_command(): starting 11728 1726882177.30771: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882177.3061981-11863-156945425246161 `" && echo ansible-tmp-1726882177.3061981-11863-156945425246161="` echo /root/.ansible/tmp/ansible-tmp-1726882177.3061981-11863-156945425246161 `" ) && sleep 0' 11728 1726882177.31776: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882177.31822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882177.32010: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882177.32014: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882177.32071: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882177.33920: stdout chunk (state=3): >>>ansible-tmp-1726882177.3061981-11863-156945425246161=/root/.ansible/tmp/ansible-tmp-1726882177.3061981-11863-156945425246161 <<< 11728 1726882177.34073: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882177.34085: stdout chunk (state=3): >>><<< 11728 1726882177.34102: stderr chunk (state=3): >>><<< 11728 1726882177.34129: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882177.3061981-11863-156945425246161=/root/.ansible/tmp/ansible-tmp-1726882177.3061981-11863-156945425246161 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882177.34198: variable 'ansible_module_compression' from source: unknown 11728 1726882177.34266: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 11728 1726882177.34329: variable 'ansible_facts' from source: unknown 11728 1726882177.34565: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882177.3061981-11863-156945425246161/AnsiballZ_setup.py 11728 1726882177.34808: Sending initial data 11728 1726882177.34826: Sent initial data (154 bytes) 11728 1726882177.35451: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882177.35472: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882177.35487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882177.35517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882177.35556: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882177.35575: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882177.35661: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882177.35690: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882177.35772: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882177.37282: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882177.37361: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
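At this point the trace shows the staging sequence for the setup module: _low_level_execute_command() runs a small shell snippet over the multiplexed SSH connection to create a per-task temp directory ('umask 77 && mkdir -p ...'), and an SFTP session is opened on the same ControlMaster socket to upload AnsiballZ_setup.py. As a rough, non-authoritative sketch of that staging flow (not Ansible's internal code), the same steps can be driven from Python with the OpenSSH client; the target address, control path, timeout and file names below are placeholders borrowed from the trace:

    import subprocess, time

    host = "root@10.31.10.229"                 # placeholder, matches the address in the ssh debug output
    mux = ["-o", "ControlMaster=auto",         # reuse one connection, like the mux_client lines above
           "-o", "ControlPersist=60s",
           "-o", "ControlPath=~/.ansible/cp/%C"]

    # 1. create a unique remote temp directory, same style as the shell snippet in the trace
    tmp = "/root/.ansible/tmp/ansible-tmp-%f" % time.time()
    subprocess.run(["ssh", *mux, host,
                    "umask 77 && mkdir -p '%s' && echo '%s'" % (tmp, tmp)], check=True)

    # 2. upload the module payload over SFTP on the same multiplexed connection
    subprocess.run(["sftp", *mux, "-b", "-", host],
                   input="put AnsiballZ_setup.py %s/AnsiballZ_setup.py\n" % tmp,
                   text=True, check=True)

    # 3. make it executable and run it, mirroring the chmod/python steps later in the trace
    subprocess.run(["ssh", *mux, host,
                    "chmod u+x '%s/AnsiballZ_setup.py' && python3 '%s/AnsiballZ_setup.py'" % (tmp, tmp)],
                   check=True)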
<<< 11728 1726882177.37416: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpzmvb1zra /root/.ansible/tmp/ansible-tmp-1726882177.3061981-11863-156945425246161/AnsiballZ_setup.py <<< 11728 1726882177.37420: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882177.3061981-11863-156945425246161/AnsiballZ_setup.py" <<< 11728 1726882177.37499: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpzmvb1zra" to remote "/root/.ansible/tmp/ansible-tmp-1726882177.3061981-11863-156945425246161/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882177.3061981-11863-156945425246161/AnsiballZ_setup.py" <<< 11728 1726882177.39032: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882177.39212: stderr chunk (state=3): >>><<< 11728 1726882177.39216: stdout chunk (state=3): >>><<< 11728 1726882177.39218: done transferring module to remote 11728 1726882177.39222: _low_level_execute_command(): starting 11728 1726882177.39224: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882177.3061981-11863-156945425246161/ /root/.ansible/tmp/ansible-tmp-1726882177.3061981-11863-156945425246161/AnsiballZ_setup.py && sleep 0' 11728 1726882177.39690: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882177.39706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882177.39718: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882177.39765: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882177.39787: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882177.39828: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11728 1726882177.42133: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882177.42179: stderr chunk (state=3): >>><<< 11728 1726882177.42183: stdout chunk (state=3): >>><<< 11728 1726882177.42201: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11728 1726882177.42205: _low_level_execute_command(): starting 11728 1726882177.42229: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882177.3061981-11863-156945425246161/AnsiballZ_setup.py && sleep 0' 11728 1726882177.42887: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882177.42890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882177.42897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882177.42900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882177.42902: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882177.42904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 11728 1726882177.42907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882177.42918: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882177.42933: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882177.42952: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882177.43032: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882177.45918: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 11728 1726882177.45953: stdout chunk (state=3): >>>import _imp # builtin <<< 11728 1726882177.45984: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 11728 1726882177.46050: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 11728 1726882177.46083: stdout chunk (state=3): >>>import 'posix' # <<< 11728 1726882177.46122: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 11728 1726882177.46155: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 11728 1726882177.46212: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 11728 1726882177.46237: stdout chunk (state=3): >>>import '_codecs' # <<< 11728 1726882177.46249: stdout chunk (state=3): >>>import 'codecs' # <<< 11728 1726882177.46285: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 11728 1726882177.46317: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 11728 1726882177.46348: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13562184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13561e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py <<< 11728 1726882177.46367: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135621aa50> <<< 11728 1726882177.46400: stdout chunk (state=3): >>>import '_signal' # import '_abc' # import 'abc' # <<< 11728 1726882177.46450: stdout chunk (state=3): >>>import 'io' # <<< 11728 1726882177.46454: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 11728 1726882177.46535: stdout chunk (state=3): >>>import '_collections_abc' # <<< 11728 1726882177.46551: stdout chunk (state=3): >>>import 'genericpath' # <<< 11728 1726882177.46561: stdout chunk (state=3): >>>import 'posixpath' # <<< 11728 1726882177.46602: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # <<< 11728 1726882177.46614: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages <<< 11728 1726882177.46922: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' <<< 11728 1726882177.46927: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135602d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135602dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
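The wall of "import ..." and "__pycache__ ... matches" lines that starts here is not produced by Ansible itself: the wrapper was invoked with PYTHONVERBOSE=1 (see the _low_level_execute_command() call above), which makes the interpreter report every module import and bytecode-cache hit while AnsiballZ_setup.py starts up. A minimal way to reproduce the same kind of trace locally, purely for illustration:

    import subprocess

    # `python -v` is the command-line equivalent of PYTHONVERBOSE=1: every import and
    # every cached .pyc lookup is reported on stderr while the interpreter runs.
    result = subprocess.run(["python3", "-v", "-c", "import json"],
                            capture_output=True, text=True)
    for line in result.stderr.splitlines()[:10]:
        print(line)                      # e.g. "import _frozen_importlib # frozen"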
<<< 11728 1726882177.47457: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 11728 1726882177.47486: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 11728 1726882177.47550: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 11728 1726882177.47581: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 11728 1726882177.47614: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135606be90> <<< 11728 1726882177.47637: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 11728 1726882177.47688: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # <<< 11728 1726882177.47691: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135606bf50> <<< 11728 1726882177.47706: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 11728 1726882177.47748: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 11728 1726882177.47819: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 11728 1726882177.47824: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 11728 1726882177.47840: stdout chunk (state=3): >>>import 'itertools' # <<< 11728 1726882177.47899: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' <<< 11728 1726882177.47914: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13560a3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 11728 1726882177.47920: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13560a3ec0> <<< 11728 1726882177.47985: stdout chunk (state=3): >>>import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1356083b60> <<< 11728 1726882177.48003: stdout chunk (state=3): >>>import '_functools' # <<< 11728 1726882177.48032: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1356081280> <<< 11728 1726882177.48191: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1356069040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches 
/usr/lib64/python3.12/re/_compiler.py <<< 11728 1726882177.48218: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 11728 1726882177.48246: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 11728 1726882177.48301: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 11728 1726882177.48303: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 11728 1726882177.48353: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13560c37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13560c23f0> <<< 11728 1726882177.48386: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1356082150> <<< 11728 1726882177.48391: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13560c0c20> <<< 11728 1726882177.48456: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 11728 1726882177.48484: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13560f8860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13560682c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 11728 1726882177.48606: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13560f8d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13560f8bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13560f8f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1356066de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 11728 1726882177.48667: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f13560f9610> <<< 11728 1726882177.48709: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13560f92e0> <<< 11728 1726882177.48713: stdout chunk (state=3): >>>import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 11728 1726882177.48730: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13560fa510> <<< 11728 1726882177.48760: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 11728 1726882177.48780: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 11728 1726882177.48815: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 11728 1726882177.48847: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 11728 1726882177.48884: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1356110710> import 'errno' # <<< 11728 1726882177.48887: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882177.48927: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1356111df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 11728 1726882177.48933: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 11728 1726882177.48955: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 11728 1726882177.48976: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1356112c90> <<< 11728 1726882177.49019: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13561132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13561121e0> <<< 11728 1726882177.49060: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 11728 1726882177.49099: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1356113d70> <<< 11728 1726882177.49130: stdout chunk (state=3): >>>import 'lzma' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f13561134a0> <<< 11728 1726882177.49171: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13560fa540> <<< 11728 1726882177.49199: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 11728 1726882177.49233: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 11728 1726882177.49260: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 11728 1726882177.49305: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882177.49312: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1355e2fbf0> <<< 11728 1726882177.49335: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 11728 1726882177.49378: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1355e586e0> <<< 11728 1726882177.49385: stdout chunk (state=3): >>>import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355e58440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1355e58710> <<< 11728 1726882177.49429: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 11728 1726882177.49561: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882177.49684: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1355e59040> <<< 11728 1726882177.49834: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882177.49850: stdout chunk (state=3): >>>import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1355e599a0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355e588f0> <<< 11728 1726882177.49872: stdout chunk (state=3): >>>import 'random' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f1355e2dd90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 11728 1726882177.49915: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 11728 1726882177.49967: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355e5adb0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355e59af0> <<< 11728 1726882177.50007: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13560fac30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 11728 1726882177.50116: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 11728 1726882177.50172: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355e87110> <<< 11728 1726882177.50246: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 11728 1726882177.50254: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 11728 1726882177.50277: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 11728 1726882177.50359: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355ea74a0> <<< 11728 1726882177.50381: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 11728 1726882177.50437: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 11728 1726882177.50512: stdout chunk (state=3): >>>import 'ntpath' # <<< 11728 1726882177.50542: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' <<< 11728 1726882177.50545: stdout chunk (state=3): >>>import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355f08260> <<< 11728 1726882177.50591: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 11728 1726882177.50618: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 11728 1726882177.50669: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 11728 1726882177.50797: stdout 
chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355f0a9c0> <<< 11728 1726882177.50935: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355f08380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355ed1280> <<< 11728 1726882177.50968: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355d19340> <<< 11728 1726882177.50994: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355ea62a0> <<< 11728 1726882177.51000: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355e5bce0> <<< 11728 1726882177.51283: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f1355d195b0> <<< 11728 1726882177.51650: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_2duh6hlc/ansible_setup_payload.zip'<<< 11728 1726882177.51671: stdout chunk (state=3): >>> # zipimport: zlib available<<< 11728 1726882177.51747: stdout chunk (state=3): >>> <<< 11728 1726882177.51900: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.51946: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 11728 1726882177.51988: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 11728 1726882177.52060: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 11728 1726882177.52231: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355d830b0> import '_typing' # <<< 11728 1726882177.52408: stdout chunk (state=3): >>> <<< 11728 1726882177.52636: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355d61fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355d61160> # zipimport: zlib available import 'ansible' # <<< 11728 1726882177.52642: stdout chunk (state=3): >>> <<< 11728 1726882177.52660: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882177.52685: stdout chunk (state=3): >>> <<< 11728 1726882177.52708: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.52753: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # <<< 11728 1726882177.52848: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.54990: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.56759: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches 
/usr/lib64/python3.12/__future__.py <<< 11728 1726882177.56807: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355d80f80> <<< 11728 1726882177.56834: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py<<< 11728 1726882177.56900: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 11728 1726882177.56922: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc'<<< 11728 1726882177.56945: stdout chunk (state=3): >>> # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 11728 1726882177.57010: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882177.57032: stdout chunk (state=3): >>> # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882177.57087: stdout chunk (state=3): >>>import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1355db29c0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355db2750> <<< 11728 1726882177.57171: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355db2060> <<< 11728 1726882177.57196: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 11728 1726882177.57497: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355db2ab0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355d83ad0> import 'atexit' # <<< 11728 1726882177.57529: stdout chunk (state=3): >>> # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1355db3740> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1355db3980> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 11728 1726882177.57533: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 11728 1726882177.57551: stdout chunk (state=3): >>> <<< 11728 1726882177.57618: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355db3ec0> import 'pwd' # <<< 11728 1726882177.57653: stdout chunk (state=3): >>> # 
/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 11728 1726882177.57710: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc'<<< 11728 1726882177.57762: stdout chunk (state=3): >>> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135572db50><<< 11728 1726882177.57772: stdout chunk (state=3): >>> <<< 11728 1726882177.57815: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882177.57839: stdout chunk (state=3): >>> # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f135572f7d0><<< 11728 1726882177.57869: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 11728 1726882177.57958: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13557301a0> <<< 11728 1726882177.57997: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 11728 1726882177.58045: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc'<<< 11728 1726882177.58079: stdout chunk (state=3): >>> import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355731340> <<< 11728 1726882177.58169: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 11728 1726882177.58230: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 11728 1726882177.58233: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc'<<< 11728 1726882177.58247: stdout chunk (state=3): >>> <<< 11728 1726882177.58363: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355733d70> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882177.58383: stdout chunk (state=3): >>> # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1355730110><<< 11728 1726882177.58428: stdout chunk (state=3): >>> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355732060><<< 11728 1726882177.58443: stdout chunk (state=3): >>> <<< 11728 1726882177.58457: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py<<< 11728 1726882177.58517: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc'<<< 11728 1726882177.58551: stdout chunk (state=3): >>> <<< 11728 1726882177.58554: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches 
/usr/lib64/python3.12/linecache.py<<< 11728 1726882177.58568: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc'<<< 11728 1726882177.58597: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 11728 1726882177.58754: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc'<<< 11728 1726882177.58796: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 11728 1726882177.58822: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc'<<< 11728 1726882177.58853: stdout chunk (state=3): >>> import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135573baa0><<< 11728 1726882177.58863: stdout chunk (state=3): >>> import '_tokenize' # <<< 11728 1726882177.59002: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135573a570> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135573a2d0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py<<< 11728 1726882177.59012: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 11728 1726882177.59180: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135573a840> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355732540> <<< 11728 1726882177.59229: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882177.59232: stdout chunk (state=3): >>> # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882177.59294: stdout chunk (state=3): >>> import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f135577fd70> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py <<< 11728 1726882177.59298: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc'<<< 11728 1726882177.59342: stdout chunk (state=3): >>> import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135577fda0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py<<< 11728 1726882177.59349: stdout chunk (state=3): >>> <<< 11728 1726882177.59368: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc'<<< 11728 1726882177.59414: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py <<< 11728 1726882177.59436: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc'<<< 11728 1726882177.59451: stdout chunk (state=3): >>> <<< 11728 1726882177.59500: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882177.59503: stdout chunk (state=3): >>> 
# extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882177.59515: stdout chunk (state=3): >>> import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1355781910> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13557816d0><<< 11728 1726882177.59543: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py<<< 11728 1726882177.59587: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 11728 1726882177.59675: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882177.59680: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882177.59704: stdout chunk (state=3): >>> import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1355783e60> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355782000><<< 11728 1726882177.59742: stdout chunk (state=3): >>> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py<<< 11728 1726882177.59747: stdout chunk (state=3): >>> <<< 11728 1726882177.59838: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py<<< 11728 1726882177.59842: stdout chunk (state=3): >>> <<< 11728 1726882177.59878: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc'<<< 11728 1726882177.59881: stdout chunk (state=3): >>> import '_string' # <<< 11728 1726882177.59953: stdout chunk (state=3): >>> import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355787620><<< 11728 1726882177.59956: stdout chunk (state=3): >>> <<< 11728 1726882177.60156: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355783fb0> <<< 11728 1726882177.60253: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882177.60256: stdout chunk (state=3): >>> <<< 11728 1726882177.60270: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882177.60320: stdout chunk (state=3): >>> import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13557883e0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882177.60350: stdout chunk (state=3): >>> # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882177.60362: stdout chunk (state=3): >>>import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13557885c0><<< 11728 1726882177.60420: stdout chunk (state=3): >>> # extension module 'systemd.id128' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882177.60455: stdout chunk (state=3): >>> # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882177.60468: stdout chunk (state=3): >>>import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1355788920> <<< 11728 1726882177.60510: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13557802c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py<<< 11728 1726882177.60520: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc'<<< 11728 1726882177.60566: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 11728 1726882177.60602: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc'<<< 11728 1726882177.60647: stdout chunk (state=3): >>> # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882177.60659: stdout chunk (state=3): >>> <<< 11728 1726882177.60707: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882177.60710: stdout chunk (state=3): >>> import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f135578bf80><<< 11728 1726882177.60946: stdout chunk (state=3): >>> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882177.60951: stdout chunk (state=3): >>> <<< 11728 1726882177.61023: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1355614f50><<< 11728 1726882177.61028: stdout chunk (state=3): >>> <<< 11728 1726882177.61031: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135578a750><<< 11728 1726882177.61049: stdout chunk (state=3): >>> <<< 11728 1726882177.61087: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882177.61113: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f135578baa0> <<< 11728 1726882177.61142: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135578a390> <<< 11728 1726882177.61357: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available <<< 11728 1726882177.61445: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.61498: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.61514: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # 
<<< 11728 1726882177.61535: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.61572: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882177.61606: stdout chunk (state=3): >>> import 'ansible.module_utils.common.text' # <<< 11728 1726882177.61623: stdout chunk (state=3): >>> <<< 11728 1726882177.61633: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882177.61750: stdout chunk (state=3): >>> <<< 11728 1726882177.61836: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882177.61839: stdout chunk (state=3): >>> <<< 11728 1726882177.62033: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882177.62036: stdout chunk (state=3): >>> <<< 11728 1726882177.62935: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882177.62941: stdout chunk (state=3): >>> <<< 11728 1726882177.63820: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 11728 1726882177.63861: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 11728 1726882177.63897: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 11728 1726882177.63945: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py<<< 11728 1726882177.63981: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 11728 1726882177.64041: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882177.64221: stdout chunk (state=3): >>> import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1355619160> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py<<< 11728 1726882177.64226: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 11728 1726882177.64228: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355619e80><<< 11728 1726882177.64261: stdout chunk (state=3): >>> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135578b5f0> <<< 11728 1726882177.64332: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 11728 1726882177.64348: stdout chunk (state=3): >>> <<< 11728 1726882177.64372: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.64418: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882177.64435: stdout chunk (state=3): >>> import 'ansible.module_utils._text' # <<< 11728 1726882177.64458: stdout chunk (state=3): >>> # zipimport: zlib available<<< 11728 1726882177.64549: stdout chunk (state=3): >>> <<< 11728 1726882177.64721: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.64947: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 11728 1726882177.64974: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 11728 1726882177.65002: stdout chunk (state=3): >>>import 'copy' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f1355619ee0> <<< 11728 1726882177.65031: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882177.65037: stdout chunk (state=3): >>> <<< 11728 1726882177.65782: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882177.65846: stdout chunk (state=3): >>> <<< 11728 1726882177.66536: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.66659: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.66769: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 11728 1726882177.66789: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.66842: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.67051: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available <<< 11728 1726882177.67144: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 11728 1726882177.67175: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.67187: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 11728 1726882177.67228: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.67279: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.67335: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 11728 1726882177.67357: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.67735: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.68109: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 11728 1726882177.68197: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 11728 1726882177.68225: stdout chunk (state=3): >>>import '_ast' # <<< 11728 1726882177.68326: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135561b0b0> <<< 11728 1726882177.68348: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.68461: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.68748: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 11728 1726882177.68783: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.68848: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.68935: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.69033: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 11728 1726882177.69103: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 11728 1726882177.69222: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 11728 
1726882177.69246: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1355625d30> <<< 11728 1726882177.69319: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355620a40> <<< 11728 1726882177.69370: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 11728 1726882177.69383: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # <<< 11728 1726882177.69404: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.69491: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.69622: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.69636: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.69702: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 11728 1726882177.69713: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 11728 1726882177.69737: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 11728 1726882177.69959: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 11728 1726882177.70002: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135570e750> <<< 11728 1726882177.70079: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355dde420> <<< 11728 1726882177.70222: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355625f10> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355625bb0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 11728 1726882177.70251: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.70288: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.70332: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 11728 1726882177.70339: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 11728 1726882177.70418: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 11728 1726882177.70447: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.70470: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.70481: stdout chunk (state=3): >>>import 'ansible.modules' # <<< 11728 1726882177.70513: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.70600: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.70699: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 11728 1726882177.70725: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.70847: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11728 1726882177.70880: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.70934: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.70990: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 11728 1726882177.71012: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.71128: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.71280: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.71315: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.71369: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 11728 1726882177.71443: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.71665: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.71926: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.71983: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.72064: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 11728 1726882177.72067: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 11728 1726882177.72163: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 11728 1726882177.72180: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13556b5af0> <<< 11728 1726882177.72215: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 11728 1726882177.72229: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 11728 1726882177.72286: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 11728 1726882177.72321: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 11728 1726882177.72369: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355267e00> <<< 11728 1726882177.72372: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882177.72391: stdout chunk (state=3): >>># extension module '_pickle' executed from 
'/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f135526c230> <<< 11728 1726882177.72560: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135569c8c0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13556b6690> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13556b41d0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13556b7c80> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 11728 1726882177.72581: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 11728 1726882177.72615: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 11728 1726882177.72619: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 11728 1726882177.72648: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 11728 1726882177.72651: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 11728 1726882177.72700: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882177.72705: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f135526f170> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135526ea20> <<< 11728 1726882177.72726: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f135526ec00> <<< 11728 1726882177.72765: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135526de50> <<< 11728 1726882177.72858: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 11728 1726882177.72947: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 11728 1726882177.72976: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135526f320> <<< 11728 1726882177.73009: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 11728 1726882177.73022: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 11728 1726882177.73045: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' <<< 11728 
1726882177.73084: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13552c9e50> <<< 11728 1726882177.73090: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135526fe30> <<< 11728 1726882177.73122: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13556b5280> import 'ansible.module_utils.facts.timeout' # <<< 11728 1726882177.73387: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available <<< 11728 1726882177.73443: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.73489: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 11728 1726882177.73511: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.73532: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 11728 1726882177.73576: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.73621: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 11728 1726882177.73630: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.73684: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.73747: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 11728 1726882177.73755: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.73813: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.73865: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 11728 1726882177.73881: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.73952: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.74034: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.74108: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.74184: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 11728 1726882177.74205: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.74947: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.75656: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 11728 1726882177.75739: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.75853: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11728 1726882177.75874: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 11728 1726882177.75887: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # <<< 11728 1726882177.75890: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.75933: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.75971: stdout 
chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 11728 1726882177.76144: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available <<< 11728 1726882177.76172: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.76210: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 11728 1726882177.76218: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.76254: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.76294: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 11728 1726882177.76301: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.76413: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.76540: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py<<< 11728 1726882177.76543: stdout chunk (state=3): >>> <<< 11728 1726882177.76556: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 11728 1726882177.76584: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13552cbe90> <<< 11728 1726882177.76645: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 11728 1726882177.76825: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13552ca990> import 'ansible.module_utils.facts.system.local' # <<< 11728 1726882177.76846: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.76933: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.77048: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 11728 1726882177.77169: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.77349: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 11728 1726882177.77396: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.77501: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 11728 1726882177.77506: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.77568: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.77627: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 11728 1726882177.77685: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 11728 1726882177.77846: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f135530a090> <<< 11728 1726882177.78130: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13552eddc0> <<< 11728 1726882177.78142: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.system.python' # <<< 11728 1726882177.78146: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.78224: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.78305: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 11728 1726882177.78308: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.78433: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.78545: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.78951: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 11728 1726882177.78966: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.79023: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 11728 1726882177.79027: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.79084: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.79147: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 11728 1726882177.79152: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 11728 1726882177.79183: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882177.79215: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882177.79223: stdout chunk (state=3): >>>import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f135531df10> <<< 11728 1726882177.79232: stdout chunk (state=3): >>>import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135531f7d0> import 'ansible.module_utils.facts.system.user' # <<< 11728 1726882177.79249: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 11728 1726882177.79274: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.79317: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.79369: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 11728 1726882177.79382: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.79613: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.79840: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 11728 1726882177.79846: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.80000: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.80146: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.80204: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.80346: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 11728 1726882177.80523: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.80730: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.hardware.freebsd' # <<< 11728 1726882177.80738: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 11728 1726882177.80743: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.80922: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.81107: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 11728 1726882177.81112: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.81151: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.81203: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.82148: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.82859: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 11728 1726882177.82876: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.83027: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.83178: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 11728 1726882177.83183: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.83334: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.83476: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 11728 1726882177.83490: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.83714: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.84049: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available <<< 11728 1726882177.84098: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 11728 1726882177.84116: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.84251: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.84399: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.84698: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.84998: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 11728 1726882177.85018: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.85067: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.85150: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available <<< 11728 1726882177.85172: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 11728 1726882177.85191: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.85283: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.85381: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 11728 1726882177.85453: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # <<< 11728 1726882177.85456: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.85533: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 11728 1726882177.85616: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 11728 1726882177.85626: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.85706: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.85845: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 11728 1726882177.86206: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.86598: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 11728 1726882177.86610: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.86689: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.86768: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 11728 1726882177.86772: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.86823: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.87046: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 11728 1726882177.87141: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.87249: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 11728 1726882177.87269: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.87281: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.87289: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # <<< 11728 1726882177.87302: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.87361: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.87413: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 11728 1726882177.87429: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.87455: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.87479: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.87544: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.87614: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.87710: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.87817: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 11728 1726882177.87823: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 11728 1726882177.87838: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.87920: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.87989: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 11728 1726882177.88156: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.88298: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.88612: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 11728 1726882177.88663: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 
1726882177.88722: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 11728 1726882177.88729: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.88788: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.88947: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 11728 1726882177.88975: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.89091: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 11728 1726882177.89100: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # <<< 11728 1726882177.89111: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.89226: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.89349: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 11728 1726882177.89356: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 11728 1726882177.89505: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882177.89703: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 11728 1726882177.89736: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 11728 1726882177.89748: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 11728 1726882177.89795: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882177.89801: stdout chunk (state=3): >>>import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f135511aab0> <<< 11728 1726882177.89849: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355118770> <<< 11728 1726882177.89881: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13551185c0> <<< 11728 1726882177.91517: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", 
"has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "29", "second": "37", "epoch": "1726882177", "epoch_int": "1726882177", "date": "2024-09-20", "time": "21:29:37", "iso8601_micro": "2024-09-21T01:29:37.901887Z", "iso8601": "2024-09-21T01:29:37Z", "iso8601_basic": "20240920T212937901887", "iso8601_basic_short": "20240920T212937", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_service_mgr": "systemd", "ansible_local": {}, "ansible_lsb": {}, "ansible_fips": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": 
"ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 11728 1726882177.92289: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 11728 1726882177.92326: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._<<< 11728 1726882177.92542: stdout chunk (state=3): >>> # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre<<< 11728 1726882177.92566: stdout chunk (state=3): >>> # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib<<< 11728 1726882177.92585: stdout chunk (state=3): >>> # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # 
cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect<<< 11728 1726882177.92621: stdout chunk (state=3): >>> # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile<<< 11728 1726882177.92639: stdout chunk (state=3): >>> # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils<<< 11728 1726882177.92667: stdout chunk (state=3): >>> # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale<<< 11728 1726882177.92688: stdout chunk (state=3): >>> # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize<<< 11728 1726882177.92737: stdout chunk (state=3): >>> # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string<<< 11728 1726882177.92743: stdout chunk (state=3): >>> # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon<<< 11728 1726882177.92765: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy 
ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes<<< 11728 1726882177.92805: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors<<< 11728 1726882177.92841: stdout chunk (state=3): >>> # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast<<< 11728 1726882177.92845: stdout chunk (state=3): >>> # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale<<< 11728 1726882177.92921: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing 
ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution<<< 11728 1726882177.93178: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing 
ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin<<< 11728 1726882177.93191: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy 
ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme<<< 11728 1726882177.93199: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux<<< 11728 1726882177.93202: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata<<< 11728 1726882177.93352: stdout chunk (state=3): >>> # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 11728 1726882177.93686: stdout chunk (state=3): >>># destroy _sitebuiltins<<< 11728 1726882177.93713: stdout chunk (state=3): >>> # destroy importlib.machinery<<< 11728 1726882177.93738: stdout chunk (state=3): >>> # destroy importlib._abc <<< 11728 1726882177.93788: stdout chunk (state=3): >>># destroy importlib.util # destroy _bz2<<< 11728 1726882177.93821: stdout chunk (state=3): >>> # destroy _compression # destroy _lzma <<< 11728 1726882177.93857: stdout chunk (state=3): >>># destroy _blake2 <<< 11728 1726882177.93896: stdout chunk (state=3): >>># destroy binascii # destroy zlib <<< 11728 1726882177.93920: stdout chunk (state=3): >>># destroy bz2 # destroy lzma<<< 11728 1726882177.93944: stdout chunk (state=3): >>> # destroy zipfile._path<<< 11728 1726882177.93981: stdout chunk (state=3): >>> # destroy zipfile # destroy pathlib<<< 11728 1726882177.93999: stdout chunk (state=3): >>> # destroy zipfile._path.glob # destroy ipaddress<<< 11728 1726882177.94059: stdout chunk (state=3): >>> # destroy ntpath<<< 11728 1726882177.94078: stdout chunk (state=3): >>> # destroy importlib<<< 11728 1726882177.94102: stdout chunk (state=3): >>> # destroy zipimport # destroy __main__<<< 11728 1726882177.94126: stdout chunk (state=3): >>> # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder<<< 11728 1726882177.94147: stdout chunk (state=3): >>> # destroy json.encoder # destroy json.scanner # destroy _json<<< 11728 1726882177.94158: stdout chunk (state=3): >>> # destroy grp # destroy encodings <<< 11728 1726882177.94199: stdout chunk (state=3): >>># destroy _locale # destroy locale<<< 11728 1726882177.94220: stdout chunk (state=3): >>> # destroy select # destroy _signal # destroy _posixsubprocess<<< 11728 1726882177.94231: stdout chunk (state=3): >>> # destroy syslog<<< 11728 1726882177.94299: stdout chunk (state=3): >>> # destroy uuid # destroy selinux<<< 11728 1726882177.94326: stdout chunk (state=3): >>> # destroy shutil <<< 11728 1726882177.94348: stdout chunk (state=3): >>># destroy distro # destroy distro.distro<<< 11728 1726882177.94434: stdout chunk (state=3): >>> # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors <<< 11728 1726882177.94465: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing <<< 11728 
1726882177.94489: stdout chunk (state=3): >>># destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal<<< 11728 1726882177.94520: stdout chunk (state=3): >>> # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle <<< 11728 1726882177.94557: stdout chunk (state=3): >>># destroy queue<<< 11728 1726882177.94581: stdout chunk (state=3): >>> # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata<<< 11728 1726882177.94613: stdout chunk (state=3): >>> # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors<<< 11728 1726882177.94651: stdout chunk (state=3): >>> # destroy _multiprocessing # destroy shlex # destroy fcntl<<< 11728 1726882177.94666: stdout chunk (state=3): >>> # destroy datetime<<< 11728 1726882177.94675: stdout chunk (state=3): >>> # destroy subprocess # destroy base64<<< 11728 1726882177.94712: stdout chunk (state=3): >>> # destroy _ssl<<< 11728 1726882177.94742: stdout chunk (state=3): >>> <<< 11728 1726882177.94753: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass<<< 11728 1726882177.94783: stdout chunk (state=3): >>> # destroy pwd # destroy termios # destroy errno<<< 11728 1726882177.94838: stdout chunk (state=3): >>> # destroy json # destroy socket # destroy struct<<< 11728 1726882177.94856: stdout chunk (state=3): >>> <<< 11728 1726882177.94859: stdout chunk (state=3): >>># destroy glob # destroy fnmatch<<< 11728 1726882177.94981: stdout chunk (state=3): >>> # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser<<< 11728 1726882177.95002: stdout chunk (state=3): >>> # cleanup[3] wiping selinux._selinux<<< 11728 1726882177.95132: stdout chunk (state=3): >>> # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback<<< 11728 1726882177.95166: stdout chunk (state=3): >>> # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools<<< 11728 1726882177.95182: stdout chunk (state=3): >>> # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # 
cleanup[3] wiping itertools # cleanup[3] wiping operator<<< 11728 1726882177.95218: stdout chunk (state=3): >>> # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath<<< 11728 1726882177.95283: stdout chunk (state=3): >>> # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io<<< 11728 1726882177.95287: stdout chunk (state=3): >>> # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs<<< 11728 1726882177.95297: stdout chunk (state=3): >>> # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal<<< 11728 1726882177.95411: stdout chunk (state=3): >>> # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread<<< 11728 1726882177.95415: stdout chunk (state=3): >>> # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128<<< 11728 1726882177.95427: stdout chunk (state=3): >>> # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 11728 1726882177.95595: stdout chunk (state=3): >>># destroy sys.monitoring<<< 11728 1726882177.95611: stdout chunk (state=3): >>> # destroy _socket <<< 11728 1726882177.95842: stdout chunk (state=3): >>># destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize <<< 11728 1726882177.95845: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves<<< 11728 1726882177.95872: stdout chunk (state=3): >>> # destroy _frozen_importlib_external <<< 11728 1726882177.96049: stdout chunk (state=3): >>># destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases<<< 11728 1726882177.96113: stdout chunk (state=3): >>> # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io<<< 11728 1726882177.96151: stdout chunk (state=3): >>> # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings <<< 11728 1726882177.96283: stdout chunk (state=3): >>># destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string<<< 11728 1726882177.96380: stdout chunk (state=3): >>> # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks<<< 11728 1726882177.96548: stdout chunk (state=3): >>> <<< 11728 1726882177.97041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 
closed. <<< 11728 1726882177.97051: stdout chunk (state=3): >>><<< 11728 1726882177.97062: stderr chunk (state=3): >>><<< 11728 1726882177.97522: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13562184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13561e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135621aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135602d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135602dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135606be90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135606bf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13560a3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13560a3ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1356083b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1356081280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1356069040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13560c37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13560c23f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1356082150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13560c0c20> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13560f8860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13560682c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13560f8d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13560f8bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13560f8f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1356066de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13560f9610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13560f92e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13560fa510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1356110710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1356111df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f1356112c90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13561132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13561121e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1356113d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13561134a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13560fa540> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1355e2fbf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1355e586e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355e58440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1355e58710> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1355e59040> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1355e599a0> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f1355e588f0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355e2dd90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355e5adb0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355e59af0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13560fac30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355e87110> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355ea74a0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355f08260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355f0a9c0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355f08380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355ed1280> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355d19340> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355ea62a0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355e5bce0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f1355d195b0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_2duh6hlc/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355d830b0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355d61fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355d61160> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355d80f80> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1355db29c0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355db2750> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355db2060> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355db2ab0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355d83ad0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1355db3740> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1355db3980> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355db3ec0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135572db50> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f135572f7d0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13557301a0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355731340> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355733d70> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1355730110> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355732060> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135573baa0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135573a570> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135573a2d0> # 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135573a840> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355732540> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f135577fd70> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135577fda0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1355781910> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13557816d0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1355783e60> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355782000> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355787620> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355783fb0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13557883e0> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13557885c0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1355788920> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13557802c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f135578bf80> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1355614f50> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135578a750> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f135578baa0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135578a390> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f1355619160> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355619e80> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135578b5f0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355619ee0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135561b0b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f1355625d30> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355620a40> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135570e750> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355dde420> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355625f10> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355625bb0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13556b5af0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355267e00> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f135526c230> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135569c8c0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13556b6690> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13556b41d0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13556b7c80> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f135526f170> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135526ea20> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f135526ec00> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135526de50> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135526f320> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13552c9e50> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135526fe30> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13556b5280> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13552cbe90> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13552ca990> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f135530a090> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13552eddc0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f135531df10> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f135531f7d0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f135511aab0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f1355118770> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13551185c0> {"ansible_facts": {"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", 
"10.29.170.12", "10.2.32.1"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "29", "second": "37", "epoch": "1726882177", "epoch_int": "1726882177", "date": "2024-09-20", "time": "21:29:37", "iso8601_micro": "2024-09-21T01:29:37.901887Z", "iso8601": "2024-09-21T01:29:37Z", "iso8601_basic": "20240920T212937901887", "iso8601_basic_short": "20240920T212937", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_service_mgr": "systemd", "ansible_local": {}, "ansible_lsb": {}, "ansible_fips": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", 
"ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy 
_weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy 
ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing 
ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy 
ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # 
destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] 
wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
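The module output captured above mixes the setup module's ansible_facts JSON with Python import and interpreter-shutdown chatter, because PYTHONVERBOSE=1 is set in the remote environment (visible under "ansible_env" in the facts). The controller has to carve the JSON document out of that surrounding noise before it can parse the result, which is why the warning just below reports junk after the JSON data. The sketch that follows is only an illustration of that kind of filtering, not Ansible's actual implementation; the helper name extract_module_json and the sample string are invented for the example.

import json

def extract_module_json(stdout: str):
    """Split mixed module output into (result_dict, junk_lines).

    Illustrative only: take the span from the first line that opens a JSON
    object to the last line that closes one, parse it, and treat everything
    else as junk, similar in spirit to the filtering the controller performs
    before emitting the "junk after the JSON data" warning seen below.
    """
    lines = stdout.splitlines()
    start = next(i for i, line in enumerate(lines) if line.lstrip().startswith("{"))
    end = max(i for i, line in enumerate(lines) if line.rstrip().endswith("}"))
    payload = "\n".join(lines[start:end + 1])
    junk = lines[:start] + lines[end + 1:]
    return json.loads(payload), junk

# Hypothetical output shaped like the capture above: verbose import noise,
# then the one-line JSON result, then interpreter cleanup messages.
sample = (
    "import 'encodings.idna' # <loader>\n"
    '{"ansible_facts": {"ansible_system": "Linux"}, "invocation": {}}\n'
    "# clear sys.path_importer_cache\n"
)
result, junk = extract_module_json(sample)
print(result["ansible_facts"]["ansible_system"])  # Linux
print(len(junk))  # 2 lines of non-JSON noise

In the run above the extraction evidently succeeded: the facts were returned normally and the extra interpreter output was only surfaced as the warning that follows.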
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] 
removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] 
removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy 
ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # 
destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # 
cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 11728 1726882177.99219: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882177.3061981-11863-156945425246161/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882177.99223: _low_level_execute_command(): starting 11728 1726882177.99226: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882177.3061981-11863-156945425246161/ > /dev/null 2>&1 && sleep 0' 11728 1726882177.99322: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882177.99326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882177.99328: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882177.99338: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882177.99384: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882177.99609: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882177.99662: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882177.99697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882178.02402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882178.02406: stdout chunk (state=3): >>><<< 11728 1726882178.02409: stderr chunk (state=3): >>><<< 11728 1726882178.02411: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882178.02413: handler run complete 11728 1726882178.02438: variable 'ansible_facts' from source: unknown 11728 1726882178.02491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882178.02904: variable 'ansible_facts' from source: unknown 11728 1726882178.02907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882178.03124: attempt loop complete, returning result 11728 1726882178.03136: _execute() done 11728 1726882178.03139: dumping result to json 11728 1726882178.03141: done dumping result, returning 11728 1726882178.03143: done running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [12673a56-9f93-5c28-a762-000000000026] 11728 1726882178.03149: sending task result for task 12673a56-9f93-5c28-a762-000000000026 11728 1726882178.03415: done sending task result for task 12673a56-9f93-5c28-a762-000000000026 11728 1726882178.03418: WORKER PROCESS EXITING ok: [managed_node3] 11728 1726882178.03513: no more pending results, returning what we have 11728 1726882178.03515: results queue empty 11728 1726882178.03516: checking for any_errors_fatal 11728 1726882178.03517: done checking for any_errors_fatal 11728 
1726882178.03518: checking for max_fail_percentage 11728 1726882178.03520: done checking for max_fail_percentage 11728 1726882178.03520: checking to see if all hosts have failed and the running result is not ok 11728 1726882178.03521: done checking to see if all hosts have failed 11728 1726882178.03522: getting the remaining hosts for this loop 11728 1726882178.03523: done getting the remaining hosts for this loop 11728 1726882178.03528: getting the next task for host managed_node3 11728 1726882178.03535: done getting next task for host managed_node3 11728 1726882178.03537: ^ task is: TASK: Check if system is ostree 11728 1726882178.03539: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882178.03543: getting variables 11728 1726882178.03544: in VariableManager get_vars() 11728 1726882178.03570: Calling all_inventory to load vars for managed_node3 11728 1726882178.03573: Calling groups_inventory to load vars for managed_node3 11728 1726882178.03576: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882178.03586: Calling all_plugins_play to load vars for managed_node3 11728 1726882178.03589: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882178.03592: Calling groups_plugins_play to load vars for managed_node3 11728 1726882178.03878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882178.04084: done with get_vars() 11728 1726882178.04098: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:29:38 -0400 (0:00:00.862) 0:00:02.894 ****** 11728 1726882178.04204: entering _queue_task() for managed_node3/stat 11728 1726882178.04513: worker is 1 (out of 1 available) 11728 1726882178.04528: exiting _queue_task() for managed_node3/stat 11728 1726882178.04540: done queuing things up, now waiting for results queue to drain 11728 1726882178.04542: waiting for pending results... 
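The stat task queued above comes from el_repo_setup.yml:17; the entries that follow show it loading the jinja2 test plugins, evaluating the conditional 'not __network_is_ostree is defined', and then running the stat module over the already-open SSH connection. The task file itself is not included in this log, so the following is only a sketch of the check it effectively performs, assuming the conventional OSTree marker path /run/ostree-booted (that path and the wording of the output are assumptions, not taken from this run):

    # Rough equivalent of the "Check if system is ostree" stat task (sketch only;
    # the marker path /run/ostree-booted is assumed, as the real task file is not shown here).
    test -e /run/ostree-booted && echo 'ostree system' || echo 'not an ostree system'
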
11728 1726882178.05011: running TaskExecutor() for managed_node3/TASK: Check if system is ostree 11728 1726882178.05016: in run() - task 12673a56-9f93-5c28-a762-000000000028 11728 1726882178.05019: variable 'ansible_search_path' from source: unknown 11728 1726882178.05022: variable 'ansible_search_path' from source: unknown 11728 1726882178.05025: calling self._execute() 11728 1726882178.05080: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882178.05092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882178.05109: variable 'omit' from source: magic vars 11728 1726882178.05559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882178.05838: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882178.06101: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882178.06104: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882178.06107: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882178.06247: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882178.06329: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882178.06359: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882178.06433: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882178.06716: Evaluated conditional (not __network_is_ostree is defined): True 11728 1726882178.06727: variable 'omit' from source: magic vars 11728 1726882178.06766: variable 'omit' from source: magic vars 11728 1726882178.06836: variable 'omit' from source: magic vars 11728 1726882178.07001: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882178.07028: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882178.07051: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882178.07080: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882178.07098: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882178.07130: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882178.07169: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882178.07206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882178.07303: Set connection var ansible_connection to ssh 11728 1726882178.07319: Set connection var ansible_shell_executable to /bin/sh 11728 1726882178.07329: Set 
connection var ansible_timeout to 10 11728 1726882178.07335: Set connection var ansible_shell_type to sh 11728 1726882178.07357: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882178.07366: Set connection var ansible_pipelining to False 11728 1726882178.07392: variable 'ansible_shell_executable' from source: unknown 11728 1726882178.07402: variable 'ansible_connection' from source: unknown 11728 1726882178.07409: variable 'ansible_module_compression' from source: unknown 11728 1726882178.07415: variable 'ansible_shell_type' from source: unknown 11728 1726882178.07421: variable 'ansible_shell_executable' from source: unknown 11728 1726882178.07427: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882178.07433: variable 'ansible_pipelining' from source: unknown 11728 1726882178.07439: variable 'ansible_timeout' from source: unknown 11728 1726882178.07598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882178.07605: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11728 1726882178.07609: variable 'omit' from source: magic vars 11728 1726882178.07611: starting attempt loop 11728 1726882178.07614: running the handler 11728 1726882178.07616: _low_level_execute_command(): starting 11728 1726882178.07619: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882178.08259: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882178.08269: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882178.08280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882178.08386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882178.08420: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882178.08469: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882178.10797: stdout chunk (state=3): >>>/root <<< 11728 1726882178.11053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882178.11058: stdout chunk (state=3): >>><<< 11728 1726882178.11061: stderr chunk (state=3): >>><<< 11728 1726882178.11180: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882178.11192: _low_level_execute_command(): starting 11728 1726882178.11200: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882178.1108963-11894-255983093204814 `" && echo ansible-tmp-1726882178.1108963-11894-255983093204814="` echo /root/.ansible/tmp/ansible-tmp-1726882178.1108963-11894-255983093204814 `" ) && sleep 0' 11728 1726882178.12149: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882178.12187: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882178.14919: stdout chunk (state=3): >>>ansible-tmp-1726882178.1108963-11894-255983093204814=/root/.ansible/tmp/ansible-tmp-1726882178.1108963-11894-255983093204814 <<< 11728 1726882178.15054: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882178.15064: stdout chunk (state=3): >>><<< 11728 1726882178.15407: stderr chunk (state=3): >>><<< 11728 1726882178.15411: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882178.1108963-11894-255983093204814=/root/.ansible/tmp/ansible-tmp-1726882178.1108963-11894-255983093204814 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882178.15414: variable 'ansible_module_compression' from source: unknown 11728 1726882178.15446: ANSIBALLZ: Using lock for stat 11728 1726882178.15453: ANSIBALLZ: Acquiring lock 11728 1726882178.15461: ANSIBALLZ: Lock acquired: 139840770724960 11728 1726882178.15468: ANSIBALLZ: Creating module 11728 1726882178.33016: ANSIBALLZ: Writing module into payload 11728 1726882178.33126: ANSIBALLZ: Writing module 11728 1726882178.33160: ANSIBALLZ: Renaming module 11728 1726882178.33171: ANSIBALLZ: Done creating module 11728 1726882178.33199: variable 'ansible_facts' from source: unknown 11728 1726882178.33299: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882178.1108963-11894-255983093204814/AnsiballZ_stat.py 11728 1726882178.33494: Sending initial data 11728 1726882178.33498: Sent initial data (153 bytes) 11728 1726882178.34123: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882178.34136: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882178.34150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882178.34241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882178.34280: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882178.34305: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882178.34321: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882178.34414: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882178.36686: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 
2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882178.36755: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882178.36832: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpsw65cpe5 /root/.ansible/tmp/ansible-tmp-1726882178.1108963-11894-255983093204814/AnsiballZ_stat.py <<< 11728 1726882178.36848: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882178.1108963-11894-255983093204814/AnsiballZ_stat.py" <<< 11728 1726882178.36888: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 11728 1726882178.36905: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpsw65cpe5" to remote "/root/.ansible/tmp/ansible-tmp-1726882178.1108963-11894-255983093204814/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882178.1108963-11894-255983093204814/AnsiballZ_stat.py" <<< 11728 1726882178.37649: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882178.37708: stderr chunk (state=3): >>><<< 11728 1726882178.37759: stdout chunk (state=3): >>><<< 11728 1726882178.37763: done transferring module to remote 11728 1726882178.37870: _low_level_execute_command(): starting 11728 1726882178.37873: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882178.1108963-11894-255983093204814/ /root/.ansible/tmp/ansible-tmp-1726882178.1108963-11894-255983093204814/AnsiballZ_stat.py && sleep 0' 11728 1726882178.38472: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882178.38481: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882178.38496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882178.38515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882178.38526: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882178.38535: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882178.38544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882178.38564: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882178.38571: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882178.38578: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11728 1726882178.38634: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882178.38637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882178.38639: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882178.38666: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882178.38694: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882178.38709: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882178.38729: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882178.38826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882178.41404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882178.41408: stdout chunk (state=3): >>><<< 11728 1726882178.41411: stderr chunk (state=3): >>><<< 11728 1726882178.41427: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882178.41435: _low_level_execute_command(): starting 11728 1726882178.41513: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882178.1108963-11894-255983093204814/AnsiballZ_stat.py && sleep 0' 11728 1726882178.42120: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882178.42212: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 
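The module is started with PYTHONVERBOSE=1 (because the task runs with _ansible_debug enabled), so the remote interpreter prints one line for every module it imports; that is what fills the stdout chunks below. The same kind of trace can be reproduced for any script with the interpreter's -v switch, to which PYTHONVERBOSE=1 is equivalent (illustrative commands, not part of the recorded run; the script path is a placeholder):

    # Trace every import while running a script, as happens for the AnsiballZ payload below.
    PYTHONVERBOSE=1 /usr/bin/python3.12 /path/to/script.py
    # equivalently:
    /usr/bin/python3.12 -v /path/to/script.py
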
11728 1726882178.42265: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882178.42268: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882178.42366: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882178.45461: stdout chunk (state=3): >>>import _frozen_importlib # frozen<<< 11728 1726882178.45471: stdout chunk (state=3): >>> <<< 11728 1726882178.45523: stdout chunk (state=3): >>>import _imp # builtin<<< 11728 1726882178.45530: stdout chunk (state=3): >>> <<< 11728 1726882178.45572: stdout chunk (state=3): >>>import '_thread' # <<< 11728 1726882178.45605: stdout chunk (state=3): >>> import '_warnings' # <<< 11728 1726882178.45615: stdout chunk (state=3): >>>import '_weakref' # <<< 11728 1726882178.45759: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 11728 1726882178.45808: stdout chunk (state=3): >>>import 'posix' # <<< 11728 1726882178.45811: stdout chunk (state=3): >>> <<< 11728 1726882178.45880: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook<<< 11728 1726882178.45929: stdout chunk (state=3): >>> import 'time' # <<< 11728 1726882178.45965: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 11728 1726882178.46027: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc'<<< 11728 1726882178.46077: stdout chunk (state=3): >>> import '_codecs' # <<< 11728 1726882178.46171: stdout chunk (state=3): >>>import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py<<< 11728 1726882178.46206: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 11728 1726882178.46234: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd26184d0><<< 11728 1726882178.46242: stdout chunk (state=3): >>> <<< 11728 1726882178.46301: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd25e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py <<< 11728 1726882178.46324: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd261aa50><<< 11728 1726882178.46502: stdout chunk (state=3): >>> import '_signal' # <<< 11728 1726882178.46518: stdout chunk (state=3): >>>import '_abc' # import 'abc' # import 'io' # import '_stat' # <<< 11728 1726882178.46559: stdout chunk (state=3): >>> import 'stat' # <<< 11728 1726882178.46654: stdout chunk (state=3): >>>import '_collections_abc' # <<< 11728 1726882178.46714: stdout chunk (state=3): >>>import 'genericpath' # <<< 11728 1726882178.46738: stdout chunk (state=3): >>>import 'posixpath' # <<< 11728 1726882178.46775: stdout chunk (state=3): >>> import 'os' # <<< 11728 1726882178.46803: stdout chunk (state=3): >>> <<< 11728 1726882178.46836: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 11728 1726882178.46864: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: 
'/usr/lib64/python3.12/site-packages'<<< 11728 1726882178.46878: stdout chunk (state=3): >>> Adding directory: '/usr/lib/python3.12/site-packages'<<< 11728 1726882178.46908: stdout chunk (state=3): >>> Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth'<<< 11728 1726882178.47069: stdout chunk (state=3): >>> # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd23c9130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 11728 1726882178.47080: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd23c9fa0> <<< 11728 1726882178.47119: stdout chunk (state=3): >>>import 'site' # <<< 11728 1726882178.47181: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux <<< 11728 1726882178.47199: stdout chunk (state=3): >>>Type "help", "copyright", "credits" or "license" for more information. <<< 11728 1726882178.47560: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py<<< 11728 1726882178.47566: stdout chunk (state=3): >>> <<< 11728 1726882178.47591: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc'<<< 11728 1726882178.47598: stdout chunk (state=3): >>> <<< 11728 1726882178.47626: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py<<< 11728 1726882178.47634: stdout chunk (state=3): >>> <<< 11728 1726882178.47655: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc'<<< 11728 1726882178.47663: stdout chunk (state=3): >>> <<< 11728 1726882178.47696: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py<<< 11728 1726882178.47703: stdout chunk (state=3): >>> <<< 11728 1726882178.47784: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py<<< 11728 1726882178.47789: stdout chunk (state=3): >>> <<< 11728 1726882178.47834: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc'<<< 11728 1726882178.47861: stdout chunk (state=3): >>> import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd2407e60><<< 11728 1726882178.47899: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py<<< 11728 1726882178.47905: stdout chunk (state=3): >>> <<< 11728 1726882178.47930: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc'<<< 11728 1726882178.47971: stdout chunk (state=3): >>> import '_operator' # <<< 11728 1726882178.47979: stdout chunk (state=3): >>> <<< 11728 1726882178.48001: stdout chunk (state=3): >>>import 'operator' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8fd2407f20><<< 11728 1726882178.48034: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py<<< 11728 1726882178.48039: stdout chunk (state=3): >>> <<< 11728 1726882178.48112: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py<<< 11728 1726882178.48117: stdout chunk (state=3): >>> <<< 11728 1726882178.48186: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc'<<< 11728 1726882178.48224: stdout chunk (state=3): >>> import 'itertools' # <<< 11728 1726882178.48227: stdout chunk (state=3): >>> <<< 11728 1726882178.48263: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py<<< 11728 1726882178.48310: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd243f890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py<<< 11728 1726882178.48313: stdout chunk (state=3): >>> <<< 11728 1726882178.48329: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc'<<< 11728 1726882178.48353: stdout chunk (state=3): >>> import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd243ff20><<< 11728 1726882178.48375: stdout chunk (state=3): >>> import '_collections' # <<< 11728 1726882178.48380: stdout chunk (state=3): >>> <<< 11728 1726882178.48467: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd241fb30> import '_functools' # <<< 11728 1726882178.48523: stdout chunk (state=3): >>> import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd241d250> <<< 11728 1726882178.48688: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd2405010> <<< 11728 1726882178.48762: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 11728 1726882178.48782: stdout chunk (state=3): >>>import '_sre' # <<< 11728 1726882178.48816: stdout chunk (state=3): >>> # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 11728 1726882178.48875: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc'<<< 11728 1726882178.48914: stdout chunk (state=3): >>> # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 11728 1726882178.48921: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc'<<< 11728 1726882178.49005: stdout chunk (state=3): >>> import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd245f800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd245e450> <<< 11728 1726882178.49034: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py<<< 11728 1726882178.49051: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc'<<< 11728 1726882178.49078: stdout chunk (state=3): >>> import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd241e120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd245ccb0><<< 11728 1726882178.49165: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py<<< 11728 1726882178.49169: stdout chunk (state=3): >>> <<< 11728 1726882178.49173: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc'<<< 11728 1726882178.49175: stdout chunk (state=3): >>> <<< 11728 1726882178.49227: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd2494860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd2404290> <<< 11728 1726882178.49231: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py<<< 11728 1726882178.49240: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc'<<< 11728 1726882178.49302: stdout chunk (state=3): >>> # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882178.49323: stdout chunk (state=3): >>> # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882178.49351: stdout chunk (state=3): >>>import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd2494d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd2494bc0><<< 11728 1726882178.49366: stdout chunk (state=3): >>> <<< 11728 1726882178.49401: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882178.49431: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd2494fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd2402db0><<< 11728 1726882178.49481: stdout chunk (state=3): >>> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 11728 1726882178.49484: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 11728 1726882178.49515: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py<<< 11728 1726882178.49563: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 11728 1726882178.49596: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd24956a0> <<< 11728 1726882178.49627: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd2495370> import 'importlib.machinery' # <<< 11728 1726882178.49717: stdout 
chunk (state=3): >>> # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd24965a0> <<< 11728 1726882178.49776: stdout chunk (state=3): >>>import 'importlib.util' # <<< 11728 1726882178.49778: stdout chunk (state=3): >>> import 'runpy' # <<< 11728 1726882178.49819: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 11728 1726882178.49869: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc'<<< 11728 1726882178.49905: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py<<< 11728 1726882178.49912: stdout chunk (state=3): >>> <<< 11728 1726882178.49935: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd24ac7a0><<< 11728 1726882178.49957: stdout chunk (state=3): >>> import 'errno' # <<< 11728 1726882178.49997: stdout chunk (state=3): >>> # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882178.50025: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882178.50031: stdout chunk (state=3): >>> import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd24ade80><<< 11728 1726882178.50069: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 11728 1726882178.50120: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 11728 1726882178.50147: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc'<<< 11728 1726882178.50165: stdout chunk (state=3): >>> <<< 11728 1726882178.50219: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd24aed20> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882178.50244: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882178.50248: stdout chunk (state=3): >>> import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd24af320><<< 11728 1726882178.50274: stdout chunk (state=3): >>> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd24ae270> <<< 11728 1726882178.50322: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc'<<< 11728 1726882178.50378: stdout chunk (state=3): >>> # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882178.50383: stdout chunk (state=3): >>> <<< 11728 1726882178.50405: stdout chunk (state=3): >>># 
extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882178.50427: stdout chunk (state=3): >>> import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd24afda0><<< 11728 1726882178.50433: stdout chunk (state=3): >>> <<< 11728 1726882178.50518: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd24af4d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd2496510><<< 11728 1726882178.50524: stdout chunk (state=3): >>> <<< 11728 1726882178.50566: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py<<< 11728 1726882178.50571: stdout chunk (state=3): >>> <<< 11728 1726882178.50615: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc'<<< 11728 1726882178.50618: stdout chunk (state=3): >>> <<< 11728 1726882178.50653: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py<<< 11728 1726882178.50691: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 11728 1726882178.50746: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882178.50749: stdout chunk (state=3): >>> # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882178.50792: stdout chunk (state=3): >>> import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd2243bf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 11728 1726882178.50810: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc'<<< 11728 1726882178.50850: stdout chunk (state=3): >>> # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882178.50863: stdout chunk (state=3): >>> <<< 11728 1726882178.50866: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882178.50911: stdout chunk (state=3): >>>import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd226c740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd226c4a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882178.50931: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882178.50933: stdout chunk (state=3): >>> <<< 11728 1726882178.50985: stdout chunk (state=3): >>>import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd226c680> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 11728 1726882178.50998: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc'<<< 11728 1726882178.51004: stdout chunk (state=3): >>> <<< 11728 1726882178.51096: stdout chunk (state=3): >>># extension module '_hashlib' loaded from 
'/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882178.51294: stdout chunk (state=3): >>> # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882178.51297: stdout chunk (state=3): >>> <<< 11728 1726882178.51301: stdout chunk (state=3): >>>import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd226cfe0> <<< 11728 1726882178.51485: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882178.51490: stdout chunk (state=3): >>> <<< 11728 1726882178.51527: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882178.51560: stdout chunk (state=3): >>>import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd226d910> <<< 11728 1726882178.51564: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd226c8c0> <<< 11728 1726882178.51612: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd2241d90> <<< 11728 1726882178.51651: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 11728 1726882178.51698: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc'<<< 11728 1726882178.51712: stdout chunk (state=3): >>> <<< 11728 1726882178.51738: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py<<< 11728 1726882178.51758: stdout chunk (state=3): >>> <<< 11728 1726882178.51788: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc'<<< 11728 1726882178.51805: stdout chunk (state=3): >>> import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd226ed20> <<< 11728 1726882178.51850: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd226da60> <<< 11728 1726882178.51917: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd2496750> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py<<< 11728 1726882178.51933: stdout chunk (state=3): >>> <<< 11728 1726882178.52018: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 11728 1726882178.52100: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc'<<< 11728 1726882178.52103: stdout chunk (state=3): >>> <<< 11728 1726882178.52150: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd2297080><<< 11728 1726882178.52158: stdout chunk (state=3): >>> <<< 11728 1726882178.52236: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py<<< 11728 1726882178.52240: stdout chunk (state=3): >>> <<< 11728 1726882178.52302: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 11728 1726882178.52336: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc'<<< 11728 1726882178.52413: stdout chunk (state=3): >>> import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd22bb440> <<< 11728 1726882178.52460: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 11728 1726882178.52528: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc'<<< 11728 1726882178.52626: stdout chunk (state=3): >>> import 'ntpath' # <<< 11728 1726882178.52673: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py <<< 11728 1726882178.52707: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd231c230> <<< 11728 1726882178.52750: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py<<< 11728 1726882178.52753: stdout chunk (state=3): >>> <<< 11728 1726882178.52818: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 11728 1726882178.52903: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 11728 1726882178.53034: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd231e990><<< 11728 1726882178.53049: stdout chunk (state=3): >>> <<< 11728 1726882178.53153: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd231c350><<< 11728 1726882178.53158: stdout chunk (state=3): >>> <<< 11728 1726882178.53214: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd22e9250><<< 11728 1726882178.53218: stdout chunk (state=3): >>> <<< 11728 1726882178.53255: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd211d310><<< 11728 1726882178.53283: stdout chunk (state=3): >>> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd22ba240> <<< 11728 1726882178.53449: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd226fc50> <<< 11728 1726882178.53500: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc'<<< 11728 1726882178.53506: stdout chunk (state=3): >>> <<< 11728 1726882178.53603: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f8fd211d5b0> <<< 11728 1726882178.53966: stdout chunk (state=3): >>># zipimport: found 30 names in 
'/tmp/ansible_stat_payload_jwexsqan/ansible_stat_payload.zip' <<< 11728 1726882178.54145: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882178.54211: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882178.54269: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py<<< 11728 1726882178.54272: stdout chunk (state=3): >>> <<< 11728 1726882178.54303: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc'<<< 11728 1726882178.54307: stdout chunk (state=3): >>> <<< 11728 1726882178.54374: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 11728 1726882178.54498: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 11728 1726882178.54542: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py<<< 11728 1726882178.54555: stdout chunk (state=3): >>> <<< 11728 1726882178.54568: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' <<< 11728 1726882178.54580: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd2172fc0> <<< 11728 1726882178.54602: stdout chunk (state=3): >>>import '_typing' # <<< 11728 1726882178.54617: stdout chunk (state=3): >>> <<< 11728 1726882178.54884: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd2151eb0> <<< 11728 1726882178.54887: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd21510a0><<< 11728 1726882178.54907: stdout chunk (state=3): >>> <<< 11728 1726882178.54956: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible' # <<< 11728 1726882178.54962: stdout chunk (state=3): >>> <<< 11728 1726882178.54986: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882178.55001: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882178.55034: stdout chunk (state=3): >>> # zipimport: zlib available<<< 11728 1726882178.55075: stdout chunk (state=3): >>> import 'ansible.module_utils' # <<< 11728 1726882178.55159: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882178.57291: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882178.59116: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 11728 1726882178.59139: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc'<<< 11728 1726882178.59145: stdout chunk (state=3): >>> <<< 11728 1726882178.59162: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd21712b0><<< 11728 1726882178.59202: stdout chunk (state=3): >>> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py<<< 11728 1726882178.59210: stdout chunk (state=3): >>> <<< 11728 1726882178.59216: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc'<<< 11728 1726882178.59259: stdout chunk (state=3): >>> # 
/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 11728 1726882178.59285: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc'<<< 11728 1726882178.59320: stdout chunk (state=3): >>> # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py<<< 11728 1726882178.59328: stdout chunk (state=3): >>> <<< 11728 1726882178.59338: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc'<<< 11728 1726882178.59388: stdout chunk (state=3): >>> # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882178.59411: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882178.59479: stdout chunk (state=3): >>>import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd219e960> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd219e6f0><<< 11728 1726882178.59485: stdout chunk (state=3): >>> <<< 11728 1726882178.59537: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd219e030><<< 11728 1726882178.59542: stdout chunk (state=3): >>> <<< 11728 1726882178.59579: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 11728 1726882178.59609: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc'<<< 11728 1726882178.59643: stdout chunk (state=3): >>> <<< 11728 1726882178.59680: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd219e480> <<< 11728 1726882178.59717: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd2173c50> import 'atexit' # <<< 11728 1726882178.59773: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd219f710><<< 11728 1726882178.59815: stdout chunk (state=3): >>> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882178.59825: stdout chunk (state=3): >>> <<< 11728 1726882178.59834: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd219f950><<< 11728 1726882178.59872: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 11728 1726882178.59968: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 11728 1726882178.59972: stdout chunk (state=3): >>> <<< 11728 1726882178.60052: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd219fe90> import 'pwd' # <<< 11728 1726882178.60087: stdout chunk (state=3): >>> # 
/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py<<< 11728 1726882178.60095: stdout chunk (state=3): >>> <<< 11728 1726882178.60126: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc'<<< 11728 1726882178.60180: stdout chunk (state=3): >>> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b0dc10> <<< 11728 1726882178.60225: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882178.60246: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882178.60284: stdout chunk (state=3): >>>import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd1b0f800> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 11728 1726882178.60315: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc'<<< 11728 1726882178.60374: stdout chunk (state=3): >>> import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b101d0> <<< 11728 1726882178.60410: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py<<< 11728 1726882178.60414: stdout chunk (state=3): >>> <<< 11728 1726882178.60460: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc'<<< 11728 1726882178.60465: stdout chunk (state=3): >>> <<< 11728 1726882178.60514: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b11370> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py<<< 11728 1726882178.60519: stdout chunk (state=3): >>> <<< 11728 1726882178.60567: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc'<<< 11728 1726882178.60599: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 11728 1726882178.60620: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc'<<< 11728 1726882178.60713: stdout chunk (state=3): >>> import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b13e30><<< 11728 1726882178.60718: stdout chunk (state=3): >>> <<< 11728 1726882178.60785: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882178.60797: stdout chunk (state=3): >>> <<< 11728 1726882178.60811: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882178.60822: stdout chunk (state=3): >>> import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd21707d0> <<< 11728 1726882178.60870: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b120f0><<< 11728 1726882178.60873: stdout chunk (state=3): >>> <<< 11728 1726882178.60939: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc 
matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc'<<< 11728 1726882178.60943: stdout chunk (state=3): >>> <<< 11728 1726882178.60983: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 11728 1726882178.61025: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py<<< 11728 1726882178.61030: stdout chunk (state=3): >>> <<< 11728 1726882178.61108: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 11728 1726882178.61119: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc'<<< 11728 1726882178.61142: stdout chunk (state=3): >>> import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b1be30><<< 11728 1726882178.61169: stdout chunk (state=3): >>> import '_tokenize' # <<< 11728 1726882178.61273: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b1a900> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b1a660><<< 11728 1726882178.61312: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py<<< 11728 1726882178.61317: stdout chunk (state=3): >>> <<< 11728 1726882178.61451: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 11728 1726882178.61462: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b1abd0><<< 11728 1726882178.61464: stdout chunk (state=3): >>> <<< 11728 1726882178.61514: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b12600> <<< 11728 1726882178.61561: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882178.61598: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882178.61601: stdout chunk (state=3): >>> <<< 11728 1726882178.61608: stdout chunk (state=3): >>>import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd1b63e90><<< 11728 1726882178.61614: stdout chunk (state=3): >>> <<< 11728 1726882178.61654: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py <<< 11728 1726882178.61670: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc'<<< 11728 1726882178.61684: stdout chunk (state=3): >>> <<< 11728 1726882178.61690: stdout chunk (state=3): >>>import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b641a0><<< 11728 1726882178.61730: stdout chunk (state=3): >>> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py<<< 11728 1726882178.61738: stdout chunk 
(state=3): >>> <<< 11728 1726882178.61770: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc'<<< 11728 1726882178.61775: stdout chunk (state=3): >>> <<< 11728 1726882178.61803: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py <<< 11728 1726882178.61868: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882178.61885: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882178.61895: stdout chunk (state=3): >>>import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd1b65c40> <<< 11728 1726882178.61937: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b65a00> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py<<< 11728 1726882178.61940: stdout chunk (state=3): >>> <<< 11728 1726882178.62157: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882178.62185: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882178.62202: stdout chunk (state=3): >>>import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd1b68200> <<< 11728 1726882178.62205: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b66330><<< 11728 1726882178.62220: stdout chunk (state=3): >>> <<< 11728 1726882178.62247: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py<<< 11728 1726882178.62398: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b6b9e0> <<< 11728 1726882178.62600: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b683b0><<< 11728 1726882178.62607: stdout chunk (state=3): >>> <<< 11728 1726882178.62704: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882178.62773: stdout chunk (state=3): >>> # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd1b6ca70> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882178.62784: stdout chunk (state=3): >>> # extension module 'systemd._reader' executed from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882178.62804: stdout chunk (state=3): >>>import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd1b6cb00><<< 11728 1726882178.62864: stdout chunk (state=3): >>> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882178.62899: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882178.62919: stdout chunk (state=3): >>>import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd1b6cd10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b64350><<< 11728 1726882178.62974: stdout chunk (state=3): >>> <<< 11728 1726882178.63003: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc'<<< 11728 1726882178.63026: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 11728 1726882178.63076: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc'<<< 11728 1726882178.63113: stdout chunk (state=3): >>> # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882178.63152: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882178.63250: stdout chunk (state=3): >>> import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd1bf4470> <<< 11728 1726882178.63439: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882178.63484: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd1bf55b0> <<< 11728 1726882178.63512: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b6ec30> <<< 11728 1726882178.63546: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882178.63597: stdout chunk (state=3): >>> import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd1b6ff80> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b6e810><<< 11728 1726882178.63638: stdout chunk (state=3): >>> # zipimport: zlib available # zipimport: zlib available <<< 11728 1726882178.63675: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 11728 1726882178.63805: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882178.63929: stdout chunk (state=3): >>> # zipimport: zlib 
available<<< 11728 1726882178.63936: stdout chunk (state=3): >>> <<< 11728 1726882178.63961: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882178.63976: stdout chunk (state=3): >>> <<< 11728 1726882178.63980: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 11728 1726882178.63992: stdout chunk (state=3): >>> <<< 11728 1726882178.64014: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882178.64039: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882178.64053: stdout chunk (state=3): >>> <<< 11728 1726882178.64056: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<< 11728 1726882178.64062: stdout chunk (state=3): >>> <<< 11728 1726882178.64089: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882178.64096: stdout chunk (state=3): >>> <<< 11728 1726882178.64276: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882178.64282: stdout chunk (state=3): >>> <<< 11728 1726882178.64484: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882178.64650: stdout chunk (state=3): >>> <<< 11728 1726882178.65410: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882178.65412: stdout chunk (state=3): >>> <<< 11728 1726882178.66318: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 11728 1726882178.66342: stdout chunk (state=3): >>> import 'ansible.module_utils.six.moves' # <<< 11728 1726882178.66396: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # <<< 11728 1726882178.66399: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 11728 1726882178.66436: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py<<< 11728 1726882178.66439: stdout chunk (state=3): >>> <<< 11728 1726882178.66478: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 11728 1726882178.66628: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd1bfd8e0><<< 11728 1726882178.66712: stdout chunk (state=3): >>> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py<<< 11728 1726882178.66743: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 11728 1726882178.66768: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1bfed80> <<< 11728 1726882178.66797: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1bf5760> <<< 11728 1726882178.66870: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 11728 1726882178.66899: stdout chunk (state=3): >>> <<< 11728 1726882178.66910: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882178.66963: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882178.66984: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 11728 1726882178.67011: stdout chunk (state=3): >>> # zipimport: zlib 
available<<< 11728 1726882178.67152: stdout chunk (state=3): >>> <<< 11728 1726882178.67251: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882178.67524: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 11728 1726882178.67547: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1bfede0> <<< 11728 1726882178.67572: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882178.68368: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882178.69087: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882178.69206: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882178.69212: stdout chunk (state=3): >>> <<< 11728 1726882178.69322: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 11728 1726882178.69327: stdout chunk (state=3): >>> <<< 11728 1726882178.69353: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882178.69358: stdout chunk (state=3): >>> <<< 11728 1726882178.69455: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.warnings' # <<< 11728 1726882178.69461: stdout chunk (state=3): >>> <<< 11728 1726882178.69483: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882178.69489: stdout chunk (state=3): >>> <<< 11728 1726882178.69590: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882178.69726: stdout chunk (state=3): >>> import 'ansible.module_utils.errors' # <<< 11728 1726882178.69757: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882178.69796: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882178.69818: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # <<< 11728 1726882178.69855: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882178.69859: stdout chunk (state=3): >>> <<< 11728 1726882178.69928: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882178.69933: stdout chunk (state=3): >>> <<< 11728 1726882178.69997: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 11728 1726882178.70026: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882178.70032: stdout chunk (state=3): >>> <<< 11728 1726882178.70411: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882178.70651: stdout chunk (state=3): >>> <<< 11728 1726882178.70830: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 11728 1726882178.70919: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 11728 1726882178.70963: stdout chunk (state=3): >>>import '_ast' # <<< 11728 1726882178.71089: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1bffa10> <<< 11728 1726882178.71124: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882178.71222: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882178.71228: stdout chunk (state=3): >>> <<< 11728 1726882178.71332: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 11728 1726882178.71343: stdout chunk (state=3): >>> <<< 11728 1726882178.71355: stdout chunk (state=3): 
>>>import 'ansible.module_utils.common.validation' # <<< 11728 1726882178.71367: stdout chunk (state=3): >>> <<< 11728 1726882178.71372: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # <<< 11728 1726882178.71403: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 11728 1726882178.71410: stdout chunk (state=3): >>> <<< 11728 1726882178.71436: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882178.71441: stdout chunk (state=3): >>> <<< 11728 1726882178.71509: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882178.71516: stdout chunk (state=3): >>> <<< 11728 1726882178.71571: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 11728 1726882178.71578: stdout chunk (state=3): >>> <<< 11728 1726882178.71598: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882178.71607: stdout chunk (state=3): >>> <<< 11728 1726882178.71688: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882178.71753: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882178.71781: stdout chunk (state=3): >>> <<< 11728 1726882178.71940: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 11728 1726882178.72020: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 11728 1726882178.72152: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so'<<< 11728 1726882178.72174: stdout chunk (state=3): >>> # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 11728 1726882178.72203: stdout chunk (state=3): >>>import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd1a0a180> <<< 11728 1726882178.72252: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1a074d0><<< 11728 1726882178.72263: stdout chunk (state=3): >>> <<< 11728 1726882178.72309: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 11728 1726882178.72332: stdout chunk (state=3): >>> import 'ansible.module_utils.common.process' # # zipimport: zlib available<<< 11728 1726882178.72434: stdout chunk (state=3): >>> # zipimport: zlib available <<< 11728 1726882178.72531: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882178.72538: stdout chunk (state=3): >>> <<< 11728 1726882178.72587: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882178.72592: stdout chunk (state=3): >>> <<< 11728 1726882178.72660: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py<<< 11728 1726882178.72675: stdout chunk (state=3): >>> # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc'<<< 11728 1726882178.72688: stdout chunk (state=3): >>> <<< 11728 1726882178.72721: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py<<< 11728 1726882178.72727: stdout chunk (state=3): >>> <<< 11728 
1726882178.72768: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc'<<< 11728 1726882178.72802: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py<<< 11728 1726882178.72808: stdout chunk (state=3): >>> <<< 11728 1726882178.72920: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py<<< 11728 1726882178.72925: stdout chunk (state=3): >>> <<< 11728 1726882178.73037: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd21f6960><<< 11728 1726882178.73042: stdout chunk (state=3): >>> <<< 11728 1726882178.73136: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd21e6660> <<< 11728 1726882178.73244: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b6dc70><<< 11728 1726882178.73271: stdout chunk (state=3): >>> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1bff1d0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 11728 1726882178.73315: stdout chunk (state=3): >>> # zipimport: zlib available <<< 11728 1726882178.73348: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882178.73472: stdout chunk (state=3): >>> import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # <<< 11728 1726882178.73524: stdout chunk (state=3): >>> <<< 11728 1726882178.73532: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 11728 1726882178.73565: stdout chunk (state=3): >>> import 'ansible.modules' # <<< 11728 1726882178.73579: stdout chunk (state=3): >>> # zipimport: zlib available <<< 11728 1726882178.73806: stdout chunk (state=3): >>># zipimport: zlib available <<< 11728 1726882178.74114: stdout chunk (state=3): >>># zipimport: zlib available<<< 11728 1726882178.74261: stdout chunk (state=3): >>> <<< 11728 1726882178.74325: stdout chunk (state=3): >>> <<< 11728 1726882178.74371: stdout chunk (state=3): >>>{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 11728 1726882178.74375: stdout chunk (state=3): >>># destroy __main__ <<< 11728 1726882178.75068: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] 
removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2<<< 11728 1726882178.75327: stdout chunk (state=3): >>> # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random<<< 11728 1726882178.75331: stdout chunk (state=3): >>> # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath<<< 11728 1726882178.75334: stdout chunk (state=3): >>> # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc<<< 11728 1726882178.75336: stdout chunk (state=3): >>> # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils<<< 11728 1726882178.75338: stdout chunk (state=3): >>> # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder<<< 11728 
1726882178.75340: stdout chunk (state=3): >>> # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select<<< 11728 1726882178.75342: stdout chunk (state=3): >>> # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize<<< 11728 1726882178.75344: stdout chunk (state=3): >>> # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket<<< 11728 1726882178.75346: stdout chunk (state=3): >>> # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc<<< 11728 1726882178.75637: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy 
ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 11728 1726882178.75692: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 11728 1726882178.75742: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma <<< 11728 1726882178.75795: stdout chunk (state=3): >>># destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 11728 1726882178.75887: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile<<< 11728 1726882178.76004: stdout chunk (state=3): >>> # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder <<< 11728 1726882178.76081: stdout chunk (state=3): >>># destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 11728 1726882178.76146: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] 
wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections<<< 11728 1726882178.76195: stdout chunk (state=3): >>> # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat<<< 11728 1726882178.76319: stdout chunk (state=3): >>> # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 11728 1726882178.76428: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 11728 1726882178.76503: stdout chunk (state=3): >>># destroy _collections <<< 11728 1726882178.76510: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 11728 1726882178.76556: stdout chunk (state=3): >>># destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 11728 1726882178.76701: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 11728 1726882178.76705: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections <<< 11728 1726882178.76738: stdout chunk (state=3): >>># destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 11728 1726882178.76853: stdout chunk (state=3): >>># destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 11728 1726882178.77313: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
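Aside (not part of the captured output): the chunked stdout above carries the stat module's JSON result, {"changed": false, "stat": {"exists": false}, ...}, for the path /run/ostree-booted, meaning the managed node is not an ostree-based system. When post-processing a raw -vvvv log like this one it can help to pull that result line back out of the captured stdout. The snippet below is a minimal sketch under the assumption that the module result is a single line that parses as a JSON object containing an "invocation" key; the helper name extract_module_result is hypothetical and is not part of Ansible.

    # Minimal sketch (hypothetical helper, not part of Ansible): recover the
    # module-result JSON from captured stdout such as the chunks above.
    import json

    def extract_module_result(stdout_text):
        """Return the first line that parses as a JSON object with an 'invocation' key."""
        for line in stdout_text.splitlines():
            line = line.strip()
            if not (line.startswith("{") and line.endswith("}")):
                continue
            try:
                candidate = json.loads(line)
            except json.JSONDecodeError:
                continue
            if isinstance(candidate, dict) and "invocation" in candidate:
                return candidate
        return None

    # The result seen in this trace:
    sample = '{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted"}}}'
    print(extract_module_result(sample)["stat"]["exists"])  # False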
<<< 11728 1726882178.77500: stderr chunk (state=3): >>><<< 11728 1726882178.77503: stdout chunk (state=3): >>><<< 11728 1726882178.77516: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd26184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd25e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd261aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd23c9130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd23c9fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd2407e60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd2407f20> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd243f890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd243ff20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd241fb30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd241d250> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd2405010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd245f800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd245e450> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd241e120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd245ccb0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd2494860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd2404290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd2494d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd2494bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd2494fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd2402db0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd24956a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd2495370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd24965a0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd24ac7a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd24ade80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f8fd24aed20> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd24af320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd24ae270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd24afda0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd24af4d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd2496510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd2243bf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd226c740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd226c4a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd226c680> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd226cfe0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd226d910> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f8fd226c8c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd2241d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd226ed20> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd226da60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd2496750> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd2297080> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd22bb440> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd231c230> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd231e990> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd231c350> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd22e9250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd211d310> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd22ba240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd226fc50> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f8fd211d5b0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_jwexsqan/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd2172fc0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd2151eb0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd21510a0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd21712b0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd219e960> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd219e6f0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd219e030> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd219e480> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd2173c50> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd219f710> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd219f950> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd219fe90> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b0dc10> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd1b0f800> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b101d0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b11370> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b13e30> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd21707d0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b120f0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b1be30> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b1a900> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b1a660> # 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b1abd0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b12600> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd1b63e90> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b641a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd1b65c40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b65a00> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd1b68200> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b66330> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b6b9e0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b683b0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd1b6ca70> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd1b6cb00> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd1b6cd10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b64350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd1bf4470> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd1bf55b0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b6ec30> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd1b6ff80> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b6e810> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd1bfd8e0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1bfed80> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1bf5760> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1bfede0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1bffa10> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8fd1a0a180> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1a074d0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd21f6960> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd21e6660> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1b6dc70> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8fd1bff1d0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing 
re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy 
ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # 
destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # 
destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
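The rc=0 stdout above is almost entirely the remote Python 3.12 interpreter's verbose import and shutdown trace; the actual module result is the single JSON line buried inside it, {"changed": false, "stat": {"exists": false}} for path /run/ostree-booted. Since /run/ostree-booted does not exist on the host, the "Check if system is ostree" task reports ok with stat.exists == false, and the extra interpreter output surrounding the JSON is what triggers the "junk after the JSON data" warning that follows. As a point of reference, below is a minimal sketch of tasks that would produce this stat invocation and the follow-up "Set flag to indicate system is ostree" task scheduled later in the log; the task names and the stat path come from the log itself, while the register and fact variable names are illustrative assumptions, not taken from the actual role source.

- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted    # follow/get_checksum/get_mime/get_attributes left at their defaults, matching the logged module_args
  register: __ostree_booted_stat    # illustrative variable name (assumption)

- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __system_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"    # illustrative fact name (assumption)

With stat.exists false here, such a flag would evaluate to false, consistent with the ok: [managed_node3] => {"changed": false, "stat": {"exists": false}} result recorded further down.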
[WARNING]: Module invocation had junk after the JSON data: [... verbatim repeat of the interpreter cleanup trace ("# destroy __main__ ... # clear sys.audit hooks") already shown in the stdout above ...]
11728 1726882178.78957: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882178.1108963-11894-255983093204814/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882178.78960: _low_level_execute_command(): starting 11728 1726882178.78962: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r
/root/.ansible/tmp/ansible-tmp-1726882178.1108963-11894-255983093204814/ > /dev/null 2>&1 && sleep 0' 11728 1726882178.79092: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882178.79098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882178.79101: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882178.79103: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882178.79105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882178.79107: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882178.79178: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882178.79315: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882178.79400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882178.81858: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882178.82110: stderr chunk (state=3): >>><<< 11728 1726882178.82114: stdout chunk (state=3): >>><<< 11728 1726882178.82131: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882178.82138: handler run complete 11728 1726882178.82159: attempt loop complete, returning result 11728 1726882178.82162: _execute() done 11728 1726882178.82164: dumping result to json 11728 1726882178.82167: done dumping result, returning 11728 1726882178.82176: done running TaskExecutor() for managed_node3/TASK: Check if system is ostree 
[12673a56-9f93-5c28-a762-000000000028] 11728 1726882178.82179: sending task result for task 12673a56-9f93-5c28-a762-000000000028 11728 1726882178.82283: done sending task result for task 12673a56-9f93-5c28-a762-000000000028 11728 1726882178.82288: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 11728 1726882178.82380: no more pending results, returning what we have 11728 1726882178.82383: results queue empty 11728 1726882178.82384: checking for any_errors_fatal 11728 1726882178.82389: done checking for any_errors_fatal 11728 1726882178.82390: checking for max_fail_percentage 11728 1726882178.82392: done checking for max_fail_percentage 11728 1726882178.82394: checking to see if all hosts have failed and the running result is not ok 11728 1726882178.82395: done checking to see if all hosts have failed 11728 1726882178.82396: getting the remaining hosts for this loop 11728 1726882178.82405: done getting the remaining hosts for this loop 11728 1726882178.82409: getting the next task for host managed_node3 11728 1726882178.82415: done getting next task for host managed_node3 11728 1726882178.82418: ^ task is: TASK: Set flag to indicate system is ostree 11728 1726882178.82421: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882178.82425: getting variables 11728 1726882178.82426: in VariableManager get_vars() 11728 1726882178.82457: Calling all_inventory to load vars for managed_node3 11728 1726882178.82460: Calling groups_inventory to load vars for managed_node3 11728 1726882178.82463: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882178.82474: Calling all_plugins_play to load vars for managed_node3 11728 1726882178.82477: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882178.82480: Calling groups_plugins_play to load vars for managed_node3 11728 1726882178.83111: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882178.83762: done with get_vars() 11728 1726882178.83775: done getting variables 11728 1726882178.83936: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:29:38 -0400 (0:00:00.798) 0:00:03.693 ****** 11728 1726882178.84083: entering _queue_task() for managed_node3/set_fact 11728 1726882178.84085: Creating lock for set_fact 11728 1726882178.84707: worker is 1 (out of 1 available) 11728 1726882178.84797: exiting _queue_task() for managed_node3/set_fact 11728 1726882178.84810: done queuing things up, now waiting for results queue to drain 11728 1726882178.84811: waiting for pending results... 
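
The stat call above checked /run/ostree-booted and found nothing, and the set_fact task queued here turns that into a flag. The actual contents of el_repo_setup.yml are not quoted in this log, so the following is only a reconstruction consistent with the logged path, register variable, conditional, and result:

- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat

- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"  # assumed mapping; matches the 'exists: false' result above
  when: not __network_is_ostree is defined

Since /run/ostree-booted does not exist on this host, the flag comes out false, which is what the ok result below reports.
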
11728 1726882178.85281: running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree 11728 1726882178.85301: in run() - task 12673a56-9f93-5c28-a762-000000000029 11728 1726882178.85315: variable 'ansible_search_path' from source: unknown 11728 1726882178.85319: variable 'ansible_search_path' from source: unknown 11728 1726882178.85353: calling self._execute() 11728 1726882178.85626: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882178.85632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882178.85642: variable 'omit' from source: magic vars 11728 1726882178.86504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882178.86990: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882178.87255: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882178.87286: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882178.87326: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882178.87436: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882178.87481: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882178.87512: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882178.87540: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882178.87652: Evaluated conditional (not __network_is_ostree is defined): True 11728 1726882178.87655: variable 'omit' from source: magic vars 11728 1726882178.87690: variable 'omit' from source: magic vars 11728 1726882178.87981: variable '__ostree_booted_stat' from source: set_fact 11728 1726882178.87985: variable 'omit' from source: magic vars 11728 1726882178.87988: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882178.88014: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882178.88031: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882178.88046: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882178.88056: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882178.88110: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882178.88114: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882178.88116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882178.88275: Set connection var ansible_connection to ssh 11728 
1726882178.88299: Set connection var ansible_shell_executable to /bin/sh 11728 1726882178.88320: Set connection var ansible_timeout to 10 11728 1726882178.88332: Set connection var ansible_shell_type to sh 11728 1726882178.88344: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882178.88354: Set connection var ansible_pipelining to False 11728 1726882178.88382: variable 'ansible_shell_executable' from source: unknown 11728 1726882178.88390: variable 'ansible_connection' from source: unknown 11728 1726882178.88399: variable 'ansible_module_compression' from source: unknown 11728 1726882178.88424: variable 'ansible_shell_type' from source: unknown 11728 1726882178.88427: variable 'ansible_shell_executable' from source: unknown 11728 1726882178.88429: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882178.88501: variable 'ansible_pipelining' from source: unknown 11728 1726882178.88506: variable 'ansible_timeout' from source: unknown 11728 1726882178.88508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882178.88574: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882178.88590: variable 'omit' from source: magic vars 11728 1726882178.88604: starting attempt loop 11728 1726882178.88610: running the handler 11728 1726882178.88641: handler run complete 11728 1726882178.88735: attempt loop complete, returning result 11728 1726882178.88738: _execute() done 11728 1726882178.88740: dumping result to json 11728 1726882178.88748: done dumping result, returning 11728 1726882178.88751: done running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree [12673a56-9f93-5c28-a762-000000000029] 11728 1726882178.88753: sending task result for task 12673a56-9f93-5c28-a762-000000000029 11728 1726882178.89012: done sending task result for task 12673a56-9f93-5c28-a762-000000000029 11728 1726882178.89015: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 11728 1726882178.89064: no more pending results, returning what we have 11728 1726882178.89066: results queue empty 11728 1726882178.89067: checking for any_errors_fatal 11728 1726882178.89073: done checking for any_errors_fatal 11728 1726882178.89074: checking for max_fail_percentage 11728 1726882178.89075: done checking for max_fail_percentage 11728 1726882178.89076: checking to see if all hosts have failed and the running result is not ok 11728 1726882178.89077: done checking to see if all hosts have failed 11728 1726882178.89077: getting the remaining hosts for this loop 11728 1726882178.89079: done getting the remaining hosts for this loop 11728 1726882178.89081: getting the next task for host managed_node3 11728 1726882178.89089: done getting next task for host managed_node3 11728 1726882178.89092: ^ task is: TASK: Fix CentOS6 Base repo 11728 1726882178.89096: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882178.89100: getting variables 11728 1726882178.89101: in VariableManager get_vars() 11728 1726882178.89133: Calling all_inventory to load vars for managed_node3 11728 1726882178.89136: Calling groups_inventory to load vars for managed_node3 11728 1726882178.89139: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882178.89149: Calling all_plugins_play to load vars for managed_node3 11728 1726882178.89152: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882178.89161: Calling groups_plugins_play to load vars for managed_node3 11728 1726882178.89515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882178.89727: done with get_vars() 11728 1726882178.89746: done getting variables 11728 1726882178.89899: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:29:38 -0400 (0:00:00.058) 0:00:03.751 ****** 11728 1726882178.89925: entering _queue_task() for managed_node3/copy 11728 1726882178.90262: worker is 1 (out of 1 available) 11728 1726882178.90273: exiting _queue_task() for managed_node3/copy 11728 1726882178.90297: done queuing things up, now waiting for results queue to drain 11728 1726882178.90299: waiting for pending results... 
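
The 'Fix CentOS6 Base repo' copy task queued here is guarded by distribution facts and, as the evaluation below shows, is skipped because this node is not CentOS 6. The repo file it would write is not captured in the log; only the guard structure is grounded, and the destination and content below are placeholders:

- name: Fix CentOS6 Base repo
  ansible.builtin.copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo  # placeholder path, not taken from this log
    content: "..."                           # actual repo definition not shown in this log
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'
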
11728 1726882178.90528: running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo 11728 1726882178.90600: in run() - task 12673a56-9f93-5c28-a762-00000000002b 11728 1726882178.90623: variable 'ansible_search_path' from source: unknown 11728 1726882178.90626: variable 'ansible_search_path' from source: unknown 11728 1726882178.90651: calling self._execute() 11728 1726882178.90729: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882178.90733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882178.90736: variable 'omit' from source: magic vars 11728 1726882178.91358: variable 'ansible_distribution' from source: facts 11728 1726882178.91386: Evaluated conditional (ansible_distribution == 'CentOS'): True 11728 1726882178.91823: variable 'ansible_distribution_major_version' from source: facts 11728 1726882178.91826: Evaluated conditional (ansible_distribution_major_version == '6'): False 11728 1726882178.91829: when evaluation is False, skipping this task 11728 1726882178.91831: _execute() done 11728 1726882178.91834: dumping result to json 11728 1726882178.91836: done dumping result, returning 11728 1726882178.91838: done running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo [12673a56-9f93-5c28-a762-00000000002b] 11728 1726882178.91841: sending task result for task 12673a56-9f93-5c28-a762-00000000002b 11728 1726882178.91908: done sending task result for task 12673a56-9f93-5c28-a762-00000000002b 11728 1726882178.91912: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 11728 1726882178.92005: no more pending results, returning what we have 11728 1726882178.92009: results queue empty 11728 1726882178.92010: checking for any_errors_fatal 11728 1726882178.92015: done checking for any_errors_fatal 11728 1726882178.92015: checking for max_fail_percentage 11728 1726882178.92017: done checking for max_fail_percentage 11728 1726882178.92018: checking to see if all hosts have failed and the running result is not ok 11728 1726882178.92018: done checking to see if all hosts have failed 11728 1726882178.92019: getting the remaining hosts for this loop 11728 1726882178.92021: done getting the remaining hosts for this loop 11728 1726882178.92025: getting the next task for host managed_node3 11728 1726882178.92031: done getting next task for host managed_node3 11728 1726882178.92035: ^ task is: TASK: Include the task 'enable_epel.yml' 11728 1726882178.92038: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882178.92041: getting variables 11728 1726882178.92042: in VariableManager get_vars() 11728 1726882178.92069: Calling all_inventory to load vars for managed_node3 11728 1726882178.92071: Calling groups_inventory to load vars for managed_node3 11728 1726882178.92074: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882178.92085: Calling all_plugins_play to load vars for managed_node3 11728 1726882178.92087: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882178.92090: Calling groups_plugins_play to load vars for managed_node3 11728 1726882178.92313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882178.92506: done with get_vars() 11728 1726882178.92516: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:29:38 -0400 (0:00:00.026) 0:00:03.778 ****** 11728 1726882178.92601: entering _queue_task() for managed_node3/include_tasks 11728 1726882178.92863: worker is 1 (out of 1 available) 11728 1726882178.92877: exiting _queue_task() for managed_node3/include_tasks 11728 1726882178.92888: done queuing things up, now waiting for results queue to drain 11728 1726882178.92889: waiting for pending results... 11728 1726882178.93314: running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' 11728 1726882178.93320: in run() - task 12673a56-9f93-5c28-a762-00000000002c 11728 1726882178.93323: variable 'ansible_search_path' from source: unknown 11728 1726882178.93325: variable 'ansible_search_path' from source: unknown 11728 1726882178.93328: calling self._execute() 11728 1726882178.93401: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882178.93423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882178.93427: variable 'omit' from source: magic vars 11728 1726882178.93814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882178.96056: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882178.96100: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882178.96127: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882178.96165: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882178.96190: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882178.96253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882178.96272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882178.96289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 11728 1726882178.96320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882178.96331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882178.96413: variable '__network_is_ostree' from source: set_fact 11728 1726882178.96427: Evaluated conditional (not __network_is_ostree | d(false)): True 11728 1726882178.96432: _execute() done 11728 1726882178.96435: dumping result to json 11728 1726882178.96437: done dumping result, returning 11728 1726882178.96443: done running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' [12673a56-9f93-5c28-a762-00000000002c] 11728 1726882178.96448: sending task result for task 12673a56-9f93-5c28-a762-00000000002c 11728 1726882178.96532: done sending task result for task 12673a56-9f93-5c28-a762-00000000002c 11728 1726882178.96534: WORKER PROCESS EXITING 11728 1726882178.96558: no more pending results, returning what we have 11728 1726882178.96562: in VariableManager get_vars() 11728 1726882178.96596: Calling all_inventory to load vars for managed_node3 11728 1726882178.96600: Calling groups_inventory to load vars for managed_node3 11728 1726882178.96603: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882178.96612: Calling all_plugins_play to load vars for managed_node3 11728 1726882178.96614: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882178.96617: Calling groups_plugins_play to load vars for managed_node3 11728 1726882178.96781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882178.96910: done with get_vars() 11728 1726882178.96919: variable 'ansible_search_path' from source: unknown 11728 1726882178.96920: variable 'ansible_search_path' from source: unknown 11728 1726882178.96944: we have included files to process 11728 1726882178.96944: generating all_blocks data 11728 1726882178.96946: done generating all_blocks data 11728 1726882178.96949: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 11728 1726882178.96950: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 11728 1726882178.96951: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 11728 1726882178.97663: done processing included file 11728 1726882178.97665: iterating over new_blocks loaded from include file 11728 1726882178.97667: in VariableManager get_vars() 11728 1726882178.97679: done with get_vars() 11728 1726882178.97681: filtering new block on tags 11728 1726882178.97704: done filtering new block on tags 11728 1726882178.97707: in VariableManager get_vars() 11728 1726882178.97718: done with get_vars() 11728 1726882178.97719: filtering new block on tags 11728 1726882178.97730: done filtering new block on tags 11728 1726882178.97732: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node3 11728 1726882178.97737: extending task lists for all hosts with included blocks 
11728 1726882178.97843: done extending task lists 11728 1726882178.97844: done processing included files 11728 1726882178.97845: results queue empty 11728 1726882178.97846: checking for any_errors_fatal 11728 1726882178.97848: done checking for any_errors_fatal 11728 1726882178.97849: checking for max_fail_percentage 11728 1726882178.97850: done checking for max_fail_percentage 11728 1726882178.97851: checking to see if all hosts have failed and the running result is not ok 11728 1726882178.97852: done checking to see if all hosts have failed 11728 1726882178.97852: getting the remaining hosts for this loop 11728 1726882178.97863: done getting the remaining hosts for this loop 11728 1726882178.97865: getting the next task for host managed_node3 11728 1726882178.97869: done getting next task for host managed_node3 11728 1726882178.97871: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 11728 1726882178.97873: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882178.97876: getting variables 11728 1726882178.97876: in VariableManager get_vars() 11728 1726882178.97884: Calling all_inventory to load vars for managed_node3 11728 1726882178.97885: Calling groups_inventory to load vars for managed_node3 11728 1726882178.97908: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882178.97914: Calling all_plugins_play to load vars for managed_node3 11728 1726882178.97920: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882178.97924: Calling groups_plugins_play to load vars for managed_node3 11728 1726882178.98066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882178.98248: done with get_vars() 11728 1726882178.98254: done getting variables 11728 1726882178.98301: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 11728 1726882178.98443: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:29:38 -0400 (0:00:00.058) 0:00:03.837 ****** 11728 1726882178.98474: entering _queue_task() for managed_node3/command 11728 1726882178.98475: Creating lock for command 11728 1726882178.98661: worker is 1 (out of 1 available) 11728 1726882178.98674: exiting _queue_task() for managed_node3/command 11728 1726882178.98685: done queuing things up, now waiting for results queue to drain 11728 1726882178.98687: waiting for pending results... 
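
The enable_epel.yml include processed above was gated on the ostree flag set earlier (the conditional not __network_is_ostree | d(false) evaluated True). The include statement itself is not quoted in the log; a sketch consistent with the logged task name, path, and conditional:

- name: Include the task 'enable_epel.yml'
  ansible.builtin.include_tasks: enable_epel.yml  # assumed relative path; the resolved file is tests/network/tasks/enable_epel.yml
  when: not __network_is_ostree | d(false)
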
11728 1726882178.98832: running TaskExecutor() for managed_node3/TASK: Create EPEL 10 11728 1726882178.98901: in run() - task 12673a56-9f93-5c28-a762-000000000046 11728 1726882178.98911: variable 'ansible_search_path' from source: unknown 11728 1726882178.98915: variable 'ansible_search_path' from source: unknown 11728 1726882178.98940: calling self._execute() 11728 1726882178.98989: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882178.98997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882178.99004: variable 'omit' from source: magic vars 11728 1726882178.99253: variable 'ansible_distribution' from source: facts 11728 1726882178.99261: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11728 1726882178.99347: variable 'ansible_distribution_major_version' from source: facts 11728 1726882178.99352: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 11728 1726882178.99355: when evaluation is False, skipping this task 11728 1726882178.99358: _execute() done 11728 1726882178.99360: dumping result to json 11728 1726882178.99364: done dumping result, returning 11728 1726882178.99370: done running TaskExecutor() for managed_node3/TASK: Create EPEL 10 [12673a56-9f93-5c28-a762-000000000046] 11728 1726882178.99375: sending task result for task 12673a56-9f93-5c28-a762-000000000046 11728 1726882178.99460: done sending task result for task 12673a56-9f93-5c28-a762-000000000046 11728 1726882178.99463: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 11728 1726882178.99514: no more pending results, returning what we have 11728 1726882178.99517: results queue empty 11728 1726882178.99518: checking for any_errors_fatal 11728 1726882178.99519: done checking for any_errors_fatal 11728 1726882178.99520: checking for max_fail_percentage 11728 1726882178.99521: done checking for max_fail_percentage 11728 1726882178.99521: checking to see if all hosts have failed and the running result is not ok 11728 1726882178.99522: done checking to see if all hosts have failed 11728 1726882178.99523: getting the remaining hosts for this loop 11728 1726882178.99524: done getting the remaining hosts for this loop 11728 1726882178.99526: getting the next task for host managed_node3 11728 1726882178.99530: done getting next task for host managed_node3 11728 1726882178.99532: ^ task is: TASK: Install yum-utils package 11728 1726882178.99535: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882178.99538: getting variables 11728 1726882178.99539: in VariableManager get_vars() 11728 1726882178.99559: Calling all_inventory to load vars for managed_node3 11728 1726882178.99562: Calling groups_inventory to load vars for managed_node3 11728 1726882178.99564: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882178.99572: Calling all_plugins_play to load vars for managed_node3 11728 1726882178.99575: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882178.99577: Calling groups_plugins_play to load vars for managed_node3 11728 1726882178.99824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882178.99973: done with get_vars() 11728 1726882178.99983: done getting variables 11728 1726882179.00068: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 21:29:39 -0400 (0:00:00.016) 0:00:03.853 ****** 11728 1726882179.00105: entering _queue_task() for managed_node3/package 11728 1726882179.00107: Creating lock for package 11728 1726882179.00336: worker is 1 (out of 1 available) 11728 1726882179.00348: exiting _queue_task() for managed_node3/package 11728 1726882179.00359: done queuing things up, now waiting for results queue to drain 11728 1726882179.00360: waiting for pending results... 
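
The 'Create EPEL 10' header above comes from a templated task name: this node reports ansible_distribution_major_version as 10, which also makes the version guard fail. The command the task would run is not visible in the log, so it is left as a placeholder in this sketch; only the name template and the two conditionals are grounded:

- name: Create EPEL {{ ansible_distribution_major_version }}
  ansible.builtin.command: "{{ epel_setup_command }}"  # placeholder variable; the real command is not shown in this log
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
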
11728 1726882179.00615: running TaskExecutor() for managed_node3/TASK: Install yum-utils package 11728 1726882179.00685: in run() - task 12673a56-9f93-5c28-a762-000000000047 11728 1726882179.00697: variable 'ansible_search_path' from source: unknown 11728 1726882179.00700: variable 'ansible_search_path' from source: unknown 11728 1726882179.00800: calling self._execute() 11728 1726882179.00811: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882179.00824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882179.00837: variable 'omit' from source: magic vars 11728 1726882179.01240: variable 'ansible_distribution' from source: facts 11728 1726882179.01257: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11728 1726882179.01391: variable 'ansible_distribution_major_version' from source: facts 11728 1726882179.01408: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 11728 1726882179.01415: when evaluation is False, skipping this task 11728 1726882179.01422: _execute() done 11728 1726882179.01428: dumping result to json 11728 1726882179.01435: done dumping result, returning 11728 1726882179.01459: done running TaskExecutor() for managed_node3/TASK: Install yum-utils package [12673a56-9f93-5c28-a762-000000000047] 11728 1726882179.01470: sending task result for task 12673a56-9f93-5c28-a762-000000000047 11728 1726882179.01643: done sending task result for task 12673a56-9f93-5c28-a762-000000000047 11728 1726882179.01647: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 11728 1726882179.01706: no more pending results, returning what we have 11728 1726882179.01709: results queue empty 11728 1726882179.01710: checking for any_errors_fatal 11728 1726882179.01717: done checking for any_errors_fatal 11728 1726882179.01717: checking for max_fail_percentage 11728 1726882179.01719: done checking for max_fail_percentage 11728 1726882179.01719: checking to see if all hosts have failed and the running result is not ok 11728 1726882179.01720: done checking to see if all hosts have failed 11728 1726882179.01721: getting the remaining hosts for this loop 11728 1726882179.01722: done getting the remaining hosts for this loop 11728 1726882179.01725: getting the next task for host managed_node3 11728 1726882179.01731: done getting next task for host managed_node3 11728 1726882179.01733: ^ task is: TASK: Enable EPEL 7 11728 1726882179.01736: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882179.01740: getting variables 11728 1726882179.01742: in VariableManager get_vars() 11728 1726882179.01768: Calling all_inventory to load vars for managed_node3 11728 1726882179.01771: Calling groups_inventory to load vars for managed_node3 11728 1726882179.01774: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882179.01782: Calling all_plugins_play to load vars for managed_node3 11728 1726882179.01784: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882179.01786: Calling groups_plugins_play to load vars for managed_node3 11728 1726882179.01950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882179.02073: done with get_vars() 11728 1726882179.02082: done getting variables 11728 1726882179.02123: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 21:29:39 -0400 (0:00:00.020) 0:00:03.873 ****** 11728 1726882179.02142: entering _queue_task() for managed_node3/command 11728 1726882179.02315: worker is 1 (out of 1 available) 11728 1726882179.02326: exiting _queue_task() for managed_node3/command 11728 1726882179.02338: done queuing things up, now waiting for results queue to drain 11728 1726882179.02339: waiting for pending results... 11728 1726882179.02476: running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 11728 1726882179.02538: in run() - task 12673a56-9f93-5c28-a762-000000000048 11728 1726882179.02548: variable 'ansible_search_path' from source: unknown 11728 1726882179.02552: variable 'ansible_search_path' from source: unknown 11728 1726882179.02581: calling self._execute() 11728 1726882179.02634: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882179.02638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882179.02647: variable 'omit' from source: magic vars 11728 1726882179.02897: variable 'ansible_distribution' from source: facts 11728 1726882179.02914: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11728 1726882179.02997: variable 'ansible_distribution_major_version' from source: facts 11728 1726882179.03001: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 11728 1726882179.03005: when evaluation is False, skipping this task 11728 1726882179.03008: _execute() done 11728 1726882179.03011: dumping result to json 11728 1726882179.03013: done dumping result, returning 11728 1726882179.03015: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 [12673a56-9f93-5c28-a762-000000000048] 11728 1726882179.03028: sending task result for task 12673a56-9f93-5c28-a762-000000000048 11728 1726882179.03099: done sending task result for task 12673a56-9f93-5c28-a762-000000000048 11728 1726882179.03101: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 11728 1726882179.03162: no more pending results, returning what we 
have 11728 1726882179.03165: results queue empty 11728 1726882179.03166: checking for any_errors_fatal 11728 1726882179.03170: done checking for any_errors_fatal 11728 1726882179.03171: checking for max_fail_percentage 11728 1726882179.03172: done checking for max_fail_percentage 11728 1726882179.03173: checking to see if all hosts have failed and the running result is not ok 11728 1726882179.03173: done checking to see if all hosts have failed 11728 1726882179.03174: getting the remaining hosts for this loop 11728 1726882179.03175: done getting the remaining hosts for this loop 11728 1726882179.03178: getting the next task for host managed_node3 11728 1726882179.03182: done getting next task for host managed_node3 11728 1726882179.03184: ^ task is: TASK: Enable EPEL 8 11728 1726882179.03187: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882179.03190: getting variables 11728 1726882179.03191: in VariableManager get_vars() 11728 1726882179.03216: Calling all_inventory to load vars for managed_node3 11728 1726882179.03219: Calling groups_inventory to load vars for managed_node3 11728 1726882179.03223: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882179.03230: Calling all_plugins_play to load vars for managed_node3 11728 1726882179.03231: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882179.03233: Calling groups_plugins_play to load vars for managed_node3 11728 1726882179.03364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882179.03475: done with get_vars() 11728 1726882179.03481: done getting variables 11728 1726882179.03521: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 21:29:39 -0400 (0:00:00.013) 0:00:03.887 ****** 11728 1726882179.03539: entering _queue_task() for managed_node3/command 11728 1726882179.03725: worker is 1 (out of 1 available) 11728 1726882179.03736: exiting _queue_task() for managed_node3/command 11728 1726882179.03748: done queuing things up, now waiting for results queue to drain 11728 1726882179.03750: waiting for pending results... 
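
'Install yum-utils package' uses the generic package action and shares the RHEL/CentOS 7-or-8 guard evaluated above. The package name follows from the task name; state: present is an assumption:

- name: Install yum-utils package
  ansible.builtin.package:
    name: yum-utils
    state: present  # assumed; not stated in the log
  when: ansible_distribution_major_version in ['7', '8']  # the ansible_distribution check shown above applies as well
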
11728 1726882179.03980: running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 11728 1726882179.04186: in run() - task 12673a56-9f93-5c28-a762-000000000049 11728 1726882179.04190: variable 'ansible_search_path' from source: unknown 11728 1726882179.04197: variable 'ansible_search_path' from source: unknown 11728 1726882179.04201: calling self._execute() 11728 1726882179.04223: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882179.04233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882179.04247: variable 'omit' from source: magic vars 11728 1726882179.04612: variable 'ansible_distribution' from source: facts 11728 1726882179.04629: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11728 1726882179.04758: variable 'ansible_distribution_major_version' from source: facts 11728 1726882179.04770: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 11728 1726882179.04782: when evaluation is False, skipping this task 11728 1726882179.04786: _execute() done 11728 1726882179.04788: dumping result to json 11728 1726882179.04798: done dumping result, returning 11728 1726882179.04801: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 [12673a56-9f93-5c28-a762-000000000049] 11728 1726882179.04805: sending task result for task 12673a56-9f93-5c28-a762-000000000049 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 11728 1726882179.04927: no more pending results, returning what we have 11728 1726882179.04930: results queue empty 11728 1726882179.04931: checking for any_errors_fatal 11728 1726882179.04936: done checking for any_errors_fatal 11728 1726882179.04936: checking for max_fail_percentage 11728 1726882179.04938: done checking for max_fail_percentage 11728 1726882179.04938: checking to see if all hosts have failed and the running result is not ok 11728 1726882179.04939: done checking to see if all hosts have failed 11728 1726882179.04940: getting the remaining hosts for this loop 11728 1726882179.04941: done getting the remaining hosts for this loop 11728 1726882179.04944: getting the next task for host managed_node3 11728 1726882179.04950: done getting next task for host managed_node3 11728 1726882179.04952: ^ task is: TASK: Enable EPEL 6 11728 1726882179.04955: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882179.04958: getting variables 11728 1726882179.04959: in VariableManager get_vars() 11728 1726882179.04983: Calling all_inventory to load vars for managed_node3 11728 1726882179.04986: Calling groups_inventory to load vars for managed_node3 11728 1726882179.04990: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882179.05001: Calling all_plugins_play to load vars for managed_node3 11728 1726882179.05003: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882179.05006: Calling groups_plugins_play to load vars for managed_node3 11728 1726882179.05114: done sending task result for task 12673a56-9f93-5c28-a762-000000000049 11728 1726882179.05117: WORKER PROCESS EXITING 11728 1726882179.05127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882179.05242: done with get_vars() 11728 1726882179.05249: done getting variables 11728 1726882179.05285: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 21:29:39 -0400 (0:00:00.017) 0:00:03.905 ****** 11728 1726882179.05306: entering _queue_task() for managed_node3/copy 11728 1726882179.05460: worker is 1 (out of 1 available) 11728 1726882179.05471: exiting _queue_task() for managed_node3/copy 11728 1726882179.05482: done queuing things up, now waiting for results queue to drain 11728 1726882179.05483: waiting for pending results... 
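
After the remaining EPEL tasks are skipped, the test play pins the provider used for the rest of this run; the ok result a little further below reports network_provider set to nm. A minimal reconstruction of that set_fact from tests_bond_options_nm.yml:13 (the playbook itself is not quoted in this log):

- name: Set network provider to 'nm'
  ansible.builtin.set_fact:
    network_provider: nm
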
11728 1726882179.05627: running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 11728 1726882179.05685: in run() - task 12673a56-9f93-5c28-a762-00000000004b 11728 1726882179.05698: variable 'ansible_search_path' from source: unknown 11728 1726882179.05702: variable 'ansible_search_path' from source: unknown 11728 1726882179.05733: calling self._execute() 11728 1726882179.05784: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882179.05788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882179.05797: variable 'omit' from source: magic vars 11728 1726882179.06063: variable 'ansible_distribution' from source: facts 11728 1726882179.06072: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11728 1726882179.06155: variable 'ansible_distribution_major_version' from source: facts 11728 1726882179.06158: Evaluated conditional (ansible_distribution_major_version == '6'): False 11728 1726882179.06161: when evaluation is False, skipping this task 11728 1726882179.06164: _execute() done 11728 1726882179.06167: dumping result to json 11728 1726882179.06169: done dumping result, returning 11728 1726882179.06175: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 [12673a56-9f93-5c28-a762-00000000004b] 11728 1726882179.06179: sending task result for task 12673a56-9f93-5c28-a762-00000000004b 11728 1726882179.06261: done sending task result for task 12673a56-9f93-5c28-a762-00000000004b 11728 1726882179.06264: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 11728 1726882179.06308: no more pending results, returning what we have 11728 1726882179.06311: results queue empty 11728 1726882179.06312: checking for any_errors_fatal 11728 1726882179.06316: done checking for any_errors_fatal 11728 1726882179.06316: checking for max_fail_percentage 11728 1726882179.06318: done checking for max_fail_percentage 11728 1726882179.06319: checking to see if all hosts have failed and the running result is not ok 11728 1726882179.06320: done checking to see if all hosts have failed 11728 1726882179.06320: getting the remaining hosts for this loop 11728 1726882179.06322: done getting the remaining hosts for this loop 11728 1726882179.06325: getting the next task for host managed_node3 11728 1726882179.06331: done getting next task for host managed_node3 11728 1726882179.06333: ^ task is: TASK: Set network provider to 'nm' 11728 1726882179.06335: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882179.06338: getting variables 11728 1726882179.06339: in VariableManager get_vars() 11728 1726882179.06360: Calling all_inventory to load vars for managed_node3 11728 1726882179.06362: Calling groups_inventory to load vars for managed_node3 11728 1726882179.06365: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882179.06373: Calling all_plugins_play to load vars for managed_node3 11728 1726882179.06375: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882179.06378: Calling groups_plugins_play to load vars for managed_node3 11728 1726882179.06506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882179.06611: done with get_vars() 11728 1726882179.06619: done getting variables 11728 1726882179.06654: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml:13 Friday 20 September 2024 21:29:39 -0400 (0:00:00.013) 0:00:03.919 ****** 11728 1726882179.06670: entering _queue_task() for managed_node3/set_fact 11728 1726882179.06826: worker is 1 (out of 1 available) 11728 1726882179.06838: exiting _queue_task() for managed_node3/set_fact 11728 1726882179.06848: done queuing things up, now waiting for results queue to drain 11728 1726882179.06849: waiting for pending results... 11728 1726882179.06972: running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' 11728 1726882179.07022: in run() - task 12673a56-9f93-5c28-a762-000000000007 11728 1726882179.07032: variable 'ansible_search_path' from source: unknown 11728 1726882179.07056: calling self._execute() 11728 1726882179.07108: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882179.07112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882179.07120: variable 'omit' from source: magic vars 11728 1726882179.07184: variable 'omit' from source: magic vars 11728 1726882179.07211: variable 'omit' from source: magic vars 11728 1726882179.07236: variable 'omit' from source: magic vars 11728 1726882179.07264: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882179.07291: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882179.07313: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882179.07326: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882179.07336: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882179.07357: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882179.07360: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882179.07362: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882179.07430: Set connection var ansible_connection 
to ssh 11728 1726882179.07439: Set connection var ansible_shell_executable to /bin/sh 11728 1726882179.07444: Set connection var ansible_timeout to 10 11728 1726882179.07446: Set connection var ansible_shell_type to sh 11728 1726882179.07452: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882179.07457: Set connection var ansible_pipelining to False 11728 1726882179.07474: variable 'ansible_shell_executable' from source: unknown 11728 1726882179.07476: variable 'ansible_connection' from source: unknown 11728 1726882179.07479: variable 'ansible_module_compression' from source: unknown 11728 1726882179.07481: variable 'ansible_shell_type' from source: unknown 11728 1726882179.07483: variable 'ansible_shell_executable' from source: unknown 11728 1726882179.07486: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882179.07490: variable 'ansible_pipelining' from source: unknown 11728 1726882179.07492: variable 'ansible_timeout' from source: unknown 11728 1726882179.07500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882179.07592: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882179.07603: variable 'omit' from source: magic vars 11728 1726882179.07608: starting attempt loop 11728 1726882179.07610: running the handler 11728 1726882179.07620: handler run complete 11728 1726882179.07629: attempt loop complete, returning result 11728 1726882179.07631: _execute() done 11728 1726882179.07634: dumping result to json 11728 1726882179.07636: done dumping result, returning 11728 1726882179.07650: done running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' [12673a56-9f93-5c28-a762-000000000007] 11728 1726882179.07652: sending task result for task 12673a56-9f93-5c28-a762-000000000007 11728 1726882179.07718: done sending task result for task 12673a56-9f93-5c28-a762-000000000007 11728 1726882179.07721: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 11728 1726882179.07785: no more pending results, returning what we have 11728 1726882179.07787: results queue empty 11728 1726882179.07788: checking for any_errors_fatal 11728 1726882179.07792: done checking for any_errors_fatal 11728 1726882179.07795: checking for max_fail_percentage 11728 1726882179.07796: done checking for max_fail_percentage 11728 1726882179.07797: checking to see if all hosts have failed and the running result is not ok 11728 1726882179.07798: done checking to see if all hosts have failed 11728 1726882179.07798: getting the remaining hosts for this loop 11728 1726882179.07799: done getting the remaining hosts for this loop 11728 1726882179.07802: getting the next task for host managed_node3 11728 1726882179.07806: done getting next task for host managed_node3 11728 1726882179.07808: ^ task is: TASK: meta (flush_handlers) 11728 1726882179.07809: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882179.07813: getting variables 11728 1726882179.07814: in VariableManager get_vars() 11728 1726882179.07833: Calling all_inventory to load vars for managed_node3 11728 1726882179.07834: Calling groups_inventory to load vars for managed_node3 11728 1726882179.07836: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882179.07842: Calling all_plugins_play to load vars for managed_node3 11728 1726882179.07843: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882179.07845: Calling groups_plugins_play to load vars for managed_node3 11728 1726882179.07941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882179.08050: done with get_vars() 11728 1726882179.08056: done getting variables 11728 1726882179.08098: in VariableManager get_vars() 11728 1726882179.08104: Calling all_inventory to load vars for managed_node3 11728 1726882179.08105: Calling groups_inventory to load vars for managed_node3 11728 1726882179.08107: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882179.08109: Calling all_plugins_play to load vars for managed_node3 11728 1726882179.08111: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882179.08112: Calling groups_plugins_play to load vars for managed_node3 11728 1726882179.08210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882179.08318: done with get_vars() 11728 1726882179.08327: done queuing things up, now waiting for results queue to drain 11728 1726882179.08328: results queue empty 11728 1726882179.08328: checking for any_errors_fatal 11728 1726882179.08330: done checking for any_errors_fatal 11728 1726882179.08330: checking for max_fail_percentage 11728 1726882179.08331: done checking for max_fail_percentage 11728 1726882179.08331: checking to see if all hosts have failed and the running result is not ok 11728 1726882179.08331: done checking to see if all hosts have failed 11728 1726882179.08332: getting the remaining hosts for this loop 11728 1726882179.08332: done getting the remaining hosts for this loop 11728 1726882179.08334: getting the next task for host managed_node3 11728 1726882179.08336: done getting next task for host managed_node3 11728 1726882179.08337: ^ task is: TASK: meta (flush_handlers) 11728 1726882179.08337: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882179.08342: getting variables 11728 1726882179.08343: in VariableManager get_vars() 11728 1726882179.08347: Calling all_inventory to load vars for managed_node3 11728 1726882179.08349: Calling groups_inventory to load vars for managed_node3 11728 1726882179.08350: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882179.08353: Calling all_plugins_play to load vars for managed_node3 11728 1726882179.08354: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882179.08356: Calling groups_plugins_play to load vars for managed_node3 11728 1726882179.08437: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882179.08541: done with get_vars() 11728 1726882179.08546: done getting variables 11728 1726882179.08572: in VariableManager get_vars() 11728 1726882179.08577: Calling all_inventory to load vars for managed_node3 11728 1726882179.08579: Calling groups_inventory to load vars for managed_node3 11728 1726882179.08581: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882179.08585: Calling all_plugins_play to load vars for managed_node3 11728 1726882179.08587: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882179.08589: Calling groups_plugins_play to load vars for managed_node3 11728 1726882179.08680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882179.08783: done with get_vars() 11728 1726882179.08790: done queuing things up, now waiting for results queue to drain 11728 1726882179.08791: results queue empty 11728 1726882179.08792: checking for any_errors_fatal 11728 1726882179.08795: done checking for any_errors_fatal 11728 1726882179.08796: checking for max_fail_percentage 11728 1726882179.08796: done checking for max_fail_percentage 11728 1726882179.08797: checking to see if all hosts have failed and the running result is not ok 11728 1726882179.08798: done checking to see if all hosts have failed 11728 1726882179.08798: getting the remaining hosts for this loop 11728 1726882179.08799: done getting the remaining hosts for this loop 11728 1726882179.08801: getting the next task for host managed_node3 11728 1726882179.08803: done getting next task for host managed_node3 11728 1726882179.08804: ^ task is: None 11728 1726882179.08805: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882179.08806: done queuing things up, now waiting for results queue to drain 11728 1726882179.08807: results queue empty 11728 1726882179.08807: checking for any_errors_fatal 11728 1726882179.08808: done checking for any_errors_fatal 11728 1726882179.08808: checking for max_fail_percentage 11728 1726882179.08809: done checking for max_fail_percentage 11728 1726882179.08809: checking to see if all hosts have failed and the running result is not ok 11728 1726882179.08809: done checking to see if all hosts have failed 11728 1726882179.08811: getting the next task for host managed_node3 11728 1726882179.08812: done getting next task for host managed_node3 11728 1726882179.08813: ^ task is: None 11728 1726882179.08813: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882179.08846: in VariableManager get_vars() 11728 1726882179.08855: done with get_vars() 11728 1726882179.08859: in VariableManager get_vars() 11728 1726882179.08864: done with get_vars() 11728 1726882179.08867: variable 'omit' from source: magic vars 11728 1726882179.08885: in VariableManager get_vars() 11728 1726882179.08891: done with get_vars() 11728 1726882179.08907: variable 'omit' from source: magic vars PLAY [Play for testing bond options] ******************************************* 11728 1726882179.09055: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 11728 1726882179.09074: getting the remaining hosts for this loop 11728 1726882179.09075: done getting the remaining hosts for this loop 11728 1726882179.09077: getting the next task for host managed_node3 11728 1726882179.09078: done getting next task for host managed_node3 11728 1726882179.09079: ^ task is: TASK: Gathering Facts 11728 1726882179.09080: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882179.09081: getting variables 11728 1726882179.09082: in VariableManager get_vars() 11728 1726882179.09087: Calling all_inventory to load vars for managed_node3 11728 1726882179.09088: Calling groups_inventory to load vars for managed_node3 11728 1726882179.09090: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882179.09095: Calling all_plugins_play to load vars for managed_node3 11728 1726882179.09104: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882179.09106: Calling groups_plugins_play to load vars for managed_node3 11728 1726882179.09185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882179.09298: done with get_vars() 11728 1726882179.09304: done getting variables 11728 1726882179.09327: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:3 Friday 20 September 2024 21:29:39 -0400 (0:00:00.026) 0:00:03.945 ****** 11728 1726882179.09341: entering _queue_task() for managed_node3/gather_facts 11728 1726882179.09482: worker is 1 (out of 1 available) 11728 1726882179.09496: exiting _queue_task() for managed_node3/gather_facts 11728 1726882179.09507: done queuing things up, now waiting for results queue to drain 11728 1726882179.09508: waiting for pending results... 
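At this point the provider wrapper has finished its work: tests_bond_options_nm.yml has set the fact network_provider=nm (the "Set network provider to 'nm'" result above), the EPEL setup tasks were skipped because their `when` conditions on ansible_distribution_major_version did not match this CentOS Stream 10 host, and execution moves on to the play "Play for testing bond options" from playbooks/tests_bond_options.yml, which begins with fact gathering. A minimal sketch of what such a wrapper playbook typically looks like, reconstructed only from the task names and results visible in this log; the exact contents of the test file are an assumption, not quoted from it:

    # tests_bond_options_nm.yml -- reconstructed sketch, not the verbatim test file
    - name: Run the bond options tests with NetworkManager as the provider
      hosts: all
      tasks:
        - name: Set network provider to 'nm'
          set_fact:
            network_provider: "nm"   # matches the ok: result shown above

    # The wrapper then hands off to the provider-agnostic test playbook, whose
    # first play ("Play for testing bond options") starts with the Gathering
    # Facts task traced below.
    - import_playbook: playbooks/tests_bond_options.yml

The Gathering Facts run that follows is the standard setup module round trip: create a remote temp directory, transfer AnsiballZ_setup.py over the multiplexed SSH connection, execute it with the remote Python, collect the ansible_facts JSON, and remove the temp directory.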
11728 1726882179.09641: running TaskExecutor() for managed_node3/TASK: Gathering Facts 11728 1726882179.09684: in run() - task 12673a56-9f93-5c28-a762-000000000071 11728 1726882179.09699: variable 'ansible_search_path' from source: unknown 11728 1726882179.09723: calling self._execute() 11728 1726882179.09772: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882179.09776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882179.09783: variable 'omit' from source: magic vars 11728 1726882179.10099: variable 'ansible_distribution_major_version' from source: facts 11728 1726882179.10106: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882179.10112: variable 'omit' from source: magic vars 11728 1726882179.10128: variable 'omit' from source: magic vars 11728 1726882179.10150: variable 'omit' from source: magic vars 11728 1726882179.10179: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882179.10208: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882179.10222: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882179.10235: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882179.10245: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882179.10265: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882179.10268: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882179.10270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882179.10336: Set connection var ansible_connection to ssh 11728 1726882179.10344: Set connection var ansible_shell_executable to /bin/sh 11728 1726882179.10349: Set connection var ansible_timeout to 10 11728 1726882179.10351: Set connection var ansible_shell_type to sh 11728 1726882179.10358: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882179.10362: Set connection var ansible_pipelining to False 11728 1726882179.10378: variable 'ansible_shell_executable' from source: unknown 11728 1726882179.10383: variable 'ansible_connection' from source: unknown 11728 1726882179.10386: variable 'ansible_module_compression' from source: unknown 11728 1726882179.10388: variable 'ansible_shell_type' from source: unknown 11728 1726882179.10391: variable 'ansible_shell_executable' from source: unknown 11728 1726882179.10402: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882179.10412: variable 'ansible_pipelining' from source: unknown 11728 1726882179.10415: variable 'ansible_timeout' from source: unknown 11728 1726882179.10417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882179.10530: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882179.10538: variable 'omit' from source: magic vars 11728 1726882179.10542: starting attempt loop 11728 1726882179.10545: running the 
handler 11728 1726882179.10556: variable 'ansible_facts' from source: unknown 11728 1726882179.10571: _low_level_execute_command(): starting 11728 1726882179.10578: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882179.11061: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882179.11099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882179.11103: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 11728 1726882179.11105: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882179.11108: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882179.11147: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882179.11154: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882179.11166: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882179.11233: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882179.13564: stdout chunk (state=3): >>>/root <<< 11728 1726882179.13710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882179.13736: stderr chunk (state=3): >>><<< 11728 1726882179.13739: stdout chunk (state=3): >>><<< 11728 1726882179.13757: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882179.13768: _low_level_execute_command(): starting 11728 1726882179.13778: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` 
echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882179.1375687-11961-263547878847660 `" && echo ansible-tmp-1726882179.1375687-11961-263547878847660="` echo /root/.ansible/tmp/ansible-tmp-1726882179.1375687-11961-263547878847660 `" ) && sleep 0' 11728 1726882179.14212: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882179.14215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882179.14226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882179.14229: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882179.14231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882179.14274: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882179.14278: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882179.14283: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882179.14335: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882179.16912: stdout chunk (state=3): >>>ansible-tmp-1726882179.1375687-11961-263547878847660=/root/.ansible/tmp/ansible-tmp-1726882179.1375687-11961-263547878847660 <<< 11728 1726882179.17064: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882179.17087: stderr chunk (state=3): >>><<< 11728 1726882179.17090: stdout chunk (state=3): >>><<< 11728 1726882179.17105: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882179.1375687-11961-263547878847660=/root/.ansible/tmp/ansible-tmp-1726882179.1375687-11961-263547878847660 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882179.17128: variable 'ansible_module_compression' from source: unknown 11728 1726882179.17170: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 11728 1726882179.17218: variable 'ansible_facts' from source: unknown 11728 1726882179.17355: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882179.1375687-11961-263547878847660/AnsiballZ_setup.py 11728 1726882179.17456: Sending initial data 11728 1726882179.17459: Sent initial data (154 bytes) 11728 1726882179.17881: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882179.17884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882179.17896: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882179.17954: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882179.17958: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882179.18014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882179.19919: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 11728 1726882179.19925: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882179.19964: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882179.20097: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmphtqw4idu /root/.ansible/tmp/ansible-tmp-1726882179.1375687-11961-263547878847660/AnsiballZ_setup.py <<< 11728 1726882179.20103: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882179.1375687-11961-263547878847660/AnsiballZ_setup.py" <<< 11728 1726882179.20150: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmphtqw4idu" to remote "/root/.ansible/tmp/ansible-tmp-1726882179.1375687-11961-263547878847660/AnsiballZ_setup.py" <<< 11728 1726882179.20153: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882179.1375687-11961-263547878847660/AnsiballZ_setup.py" <<< 11728 1726882179.21230: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882179.21270: stderr chunk (state=3): >>><<< 11728 1726882179.21274: stdout chunk (state=3): >>><<< 11728 1726882179.21297: done transferring module to remote 11728 1726882179.21307: _low_level_execute_command(): starting 11728 1726882179.21312: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882179.1375687-11961-263547878847660/ /root/.ansible/tmp/ansible-tmp-1726882179.1375687-11961-263547878847660/AnsiballZ_setup.py && sleep 0' 11728 1726882179.21755: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882179.21758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882179.21760: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882179.21766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882179.21814: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882179.21818: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882179.21875: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882179.23835: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882179.23839: stdout chunk (state=3): >>><<< 11728 1726882179.23841: stderr chunk (state=3): >>><<< 11728 1726882179.23844: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 
originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882179.23846: _low_level_execute_command(): starting 11728 1726882179.23848: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882179.1375687-11961-263547878847660/AnsiballZ_setup.py && sleep 0' 11728 1726882179.24491: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882179.24658: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882179.24679: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882179.24743: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882179.24804: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882179.98250: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.4423828125, "5m": 0.23583984375, "15m": 0.1240234375}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "29", "second": "39", "epoch": "1726882179", "epoch_int": "1726882179", "date": "2024-09-20", "time": "21:29:39", "iso8601_micro": "2024-09-21T01:29:39.645603Z", "iso8601": "2024-09-21T01:29:39Z", "iso8601_basic": "20240920T212939645603", "iso8601_basic_short": "20240920T212939", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "L<<< 11728 1726882179.98277: stdout chunk (state=3): >>>ESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": 
{"device": "eth0", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "pref<<< 11728 1726882179.98288: stdout chunk (state=3): >>>ix": "22"}, "ipv6": [{"address": "fe80::1087:27ff:fe91:8737", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.229"], "ansible_all_ipv6_addresses": ["fe80::1087:27ff:fe91:8737"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.229", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1087:27ff:fe91:8737"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, 
"ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2958, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 573, "free": 2958}, "nocache": {"free": 3272, "used": 259}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product<<< 11728 1726882179.98311: stdout chunk (state=3): >>>_uuid": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 486, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805490176, "block_size": 4096, "block_total": 65519099, "block_available": 63917356, "block_used": 1601743, "inode_total": 131070960, "inode_available": 131029137, "inode_used": 41823, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 11728 1726882180.00270: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 11728 1726882180.00300: stderr chunk (state=3): >>><<< 11728 1726882180.00304: stdout chunk (state=3): >>><<< 11728 1726882180.00332: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCv7uM8iExTeI4wsxGEirDCIB5rfuashDyqixAMrsgojV44m9e49NO3hj7ILsTTBL2CHnfLuLE1/PLpq7UY8Z1Z8ro+SmmXu++VXRqryH5co2uqHva7V6sHb6D0w7V9QhBLpdZFYEoP0DS5gVD9JQFynOilgl8wt/jWccIG1lWZi9pozQdP7A/myzjixT/sJ/dwyz8xvTWJg8mm1MsbYn2WTH8iil55RGt5+Srq66y14fY2WfYG2fpZAu2FUQP08MxFIAzAetJatr6cWpPKpSpFt3GxBUw9mZMYCqrmgqwBD/PAtXD6Q7x/7qAtiiHsfMBTZienaA1mW1aNHB5lYinW+yIEPJsEXOfVQXD7Grje437Hq7ilY2Ls8shFo/H1kZ7MVesrrJ0x/2SBU9GvKJMaweWKcsmmll+jNBUuGX6ts04Vmsca92EMTJvbEZ5S0c4wSIE0d0Abf1Xqh6e9aP6EWDz6EY13coJ8t20q68K2L8C+7SV2ymAL1nKR36KDmUU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK8+EpkEsEK0/7/tF+Ot2JevPtJYRlnBvekg0Ue9FRv3lrN7bw8W95KfTN9YYbHxSXwfmPM7CC79pp6v7bDk8dE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFW1A+ae3pfP8rgVu0EA2QvBQu2xPGiaOdV7VpH2SdJ3", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-10-229", "ansible_nodename": "ip-10-31-10-229.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec23ea4468ccc875d6f6db60ff64318a", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.4423828125, "5m": 0.23583984375, "15m": 0.1240234375}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "29", "second": "39", "epoch": "1726882179", "epoch_int": "1726882179", "date": "2024-09-20", "time": "21:29:39", "iso8601_micro": 
"2024-09-21T01:29:39.645603Z", "iso8601": "2024-09-21T01:29:39Z", "iso8601_basic": "20240920T212939645603", "iso8601_basic_short": "20240920T212939", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 53716 10.31.10.229 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 53716 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off 
[fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1087:27ff:fe91:8737", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", 
"hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.229", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:87:27:91:87:37", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.229"], "ansible_all_ipv6_addresses": ["fe80::1087:27ff:fe91:8737"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.229", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1087:27ff:fe91:8737"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2958, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 573, "free": 2958}, "nocache": {"free": 3272, "used": 259}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_uuid": "ec23ea44-68cc-c875-d6f6-db60ff64318a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 486, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805490176, "block_size": 4096, "block_total": 65519099, "block_available": 63917356, "block_used": 1601743, "inode_total": 131070960, "inode_available": 131029137, "inode_used": 41823, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], 
"module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 11728 1726882180.00528: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882179.1375687-11961-263547878847660/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882180.00545: _low_level_execute_command(): starting 11728 1726882180.00549: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882179.1375687-11961-263547878847660/ > /dev/null 2>&1 && sleep 0' 11728 1726882180.00981: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882180.00985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882180.00987: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882180.00990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882180.00992: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882180.01044: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882180.01047: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882180.01097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882180.02976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882180.02980: stdout chunk (state=3): >>><<< 11728 1726882180.02982: stderr chunk (state=3): >>><<< 11728 1726882180.03199: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882180.03203: handler run complete 11728 1726882180.03205: variable 'ansible_facts' from source: unknown 11728 1726882180.03237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882180.03579: variable 'ansible_facts' from source: unknown 11728 1726882180.03668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882180.03812: attempt loop complete, returning result 11728 1726882180.03821: _execute() done 11728 1726882180.03827: dumping result to json 11728 1726882180.03866: done dumping result, returning 11728 1726882180.03877: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [12673a56-9f93-5c28-a762-000000000071] 11728 1726882180.03885: sending task result for task 12673a56-9f93-5c28-a762-000000000071 ok: [managed_node3] 11728 1726882180.04849: no more pending results, returning what we have 11728 1726882180.04852: results queue empty 11728 1726882180.04853: checking for any_errors_fatal 11728 1726882180.04854: done checking for any_errors_fatal 11728 1726882180.04855: checking for max_fail_percentage 11728 1726882180.04857: done checking for max_fail_percentage 11728 1726882180.04858: checking to see if all hosts have failed and the running result is not ok 11728 1726882180.04858: done checking to see if all hosts have failed 11728 1726882180.04859: getting the remaining hosts for this loop 11728 1726882180.04860: done getting the remaining hosts for this loop 11728 1726882180.04864: getting the next task for host managed_node3 11728 1726882180.04869: done getting next task 
for host managed_node3 11728 1726882180.04870: ^ task is: TASK: meta (flush_handlers) 11728 1726882180.04873: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882180.04876: getting variables 11728 1726882180.04877: in VariableManager get_vars() 11728 1726882180.04961: Calling all_inventory to load vars for managed_node3 11728 1726882180.04964: Calling groups_inventory to load vars for managed_node3 11728 1726882180.04968: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882180.04974: done sending task result for task 12673a56-9f93-5c28-a762-000000000071 11728 1726882180.04977: WORKER PROCESS EXITING 11728 1726882180.04987: Calling all_plugins_play to load vars for managed_node3 11728 1726882180.04990: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882180.04995: Calling groups_plugins_play to load vars for managed_node3 11728 1726882180.05211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882180.05377: done with get_vars() 11728 1726882180.05391: done getting variables 11728 1726882180.05454: in VariableManager get_vars() 11728 1726882180.05463: Calling all_inventory to load vars for managed_node3 11728 1726882180.05465: Calling groups_inventory to load vars for managed_node3 11728 1726882180.05467: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882180.05471: Calling all_plugins_play to load vars for managed_node3 11728 1726882180.05473: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882180.05475: Calling groups_plugins_play to load vars for managed_node3 11728 1726882180.05599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882180.05783: done with get_vars() 11728 1726882180.05800: done queuing things up, now waiting for results queue to drain 11728 1726882180.05802: results queue empty 11728 1726882180.05803: checking for any_errors_fatal 11728 1726882180.05806: done checking for any_errors_fatal 11728 1726882180.05807: checking for max_fail_percentage 11728 1726882180.05813: done checking for max_fail_percentage 11728 1726882180.05814: checking to see if all hosts have failed and the running result is not ok 11728 1726882180.05814: done checking to see if all hosts have failed 11728 1726882180.05815: getting the remaining hosts for this loop 11728 1726882180.05816: done getting the remaining hosts for this loop 11728 1726882180.05818: getting the next task for host managed_node3 11728 1726882180.05832: done getting next task for host managed_node3 11728 1726882180.05835: ^ task is: TASK: Show playbook name 11728 1726882180.05836: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882180.05838: getting variables 11728 1726882180.05839: in VariableManager get_vars() 11728 1726882180.05846: Calling all_inventory to load vars for managed_node3 11728 1726882180.05848: Calling groups_inventory to load vars for managed_node3 11728 1726882180.05850: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882180.05855: Calling all_plugins_play to load vars for managed_node3 11728 1726882180.05858: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882180.05860: Calling groups_plugins_play to load vars for managed_node3 11728 1726882180.05997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882180.06171: done with get_vars() 11728 1726882180.06179: done getting variables 11728 1726882180.06252: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show playbook name] ****************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:32 Friday 20 September 2024 21:29:40 -0400 (0:00:00.969) 0:00:04.915 ****** 11728 1726882180.06281: entering _queue_task() for managed_node3/debug 11728 1726882180.06283: Creating lock for debug 11728 1726882180.06622: worker is 1 (out of 1 available) 11728 1726882180.06633: exiting _queue_task() for managed_node3/debug 11728 1726882180.06643: done queuing things up, now waiting for results queue to drain 11728 1726882180.06644: waiting for pending results... 
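[editor's note] The TASK [Show playbook name] banner above points at tests_bond_options.yml:32, and the result a little further down shows it simply prints "this is: playbooks/tests_bond_options.yml". A hedged sketch of what such a debug task could look like; the variable name used here is an assumption for illustration, not taken from the collection source:

    # Hypothetical sketch -- the real task at tests_bond_options.yml:32 may differ.
    - name: Show playbook name
      debug:
        msg: "this is: {{ playbook }}"
      vars:
        # assumed helper var; the log only shows the rendered message
        playbook: playbooks/tests_bond_options.yml
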
11728 1726882180.06831: running TaskExecutor() for managed_node3/TASK: Show playbook name 11728 1726882180.06924: in run() - task 12673a56-9f93-5c28-a762-00000000000b 11728 1726882180.06943: variable 'ansible_search_path' from source: unknown 11728 1726882180.06980: calling self._execute() 11728 1726882180.07061: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.07072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.07085: variable 'omit' from source: magic vars 11728 1726882180.07462: variable 'ansible_distribution_major_version' from source: facts 11728 1726882180.07483: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882180.07498: variable 'omit' from source: magic vars 11728 1726882180.07528: variable 'omit' from source: magic vars 11728 1726882180.07566: variable 'omit' from source: magic vars 11728 1726882180.07624: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882180.07663: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882180.07695: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882180.07792: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882180.07798: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882180.07801: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882180.07804: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.07806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.07916: Set connection var ansible_connection to ssh 11728 1726882180.07936: Set connection var ansible_shell_executable to /bin/sh 11728 1726882180.07947: Set connection var ansible_timeout to 10 11728 1726882180.07953: Set connection var ansible_shell_type to sh 11728 1726882180.07965: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882180.07975: Set connection var ansible_pipelining to False 11728 1726882180.08009: variable 'ansible_shell_executable' from source: unknown 11728 1726882180.08018: variable 'ansible_connection' from source: unknown 11728 1726882180.08028: variable 'ansible_module_compression' from source: unknown 11728 1726882180.08042: variable 'ansible_shell_type' from source: unknown 11728 1726882180.08120: variable 'ansible_shell_executable' from source: unknown 11728 1726882180.08123: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.08126: variable 'ansible_pipelining' from source: unknown 11728 1726882180.08128: variable 'ansible_timeout' from source: unknown 11728 1726882180.08130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.08246: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882180.08269: variable 'omit' from source: magic vars 11728 1726882180.08281: starting attempt loop 11728 1726882180.08291: running the handler 11728 
1726882180.08354: handler run complete 11728 1726882180.08389: attempt loop complete, returning result 11728 1726882180.08400: _execute() done 11728 1726882180.08409: dumping result to json 11728 1726882180.08417: done dumping result, returning 11728 1726882180.08448: done running TaskExecutor() for managed_node3/TASK: Show playbook name [12673a56-9f93-5c28-a762-00000000000b] 11728 1726882180.08452: sending task result for task 12673a56-9f93-5c28-a762-00000000000b ok: [managed_node3] => {} MSG: this is: playbooks/tests_bond_options.yml 11728 1726882180.08641: no more pending results, returning what we have 11728 1726882180.08645: results queue empty 11728 1726882180.08646: checking for any_errors_fatal 11728 1726882180.08648: done checking for any_errors_fatal 11728 1726882180.08648: checking for max_fail_percentage 11728 1726882180.08650: done checking for max_fail_percentage 11728 1726882180.08650: checking to see if all hosts have failed and the running result is not ok 11728 1726882180.08651: done checking to see if all hosts have failed 11728 1726882180.08652: getting the remaining hosts for this loop 11728 1726882180.08654: done getting the remaining hosts for this loop 11728 1726882180.08657: getting the next task for host managed_node3 11728 1726882180.08666: done getting next task for host managed_node3 11728 1726882180.08669: ^ task is: TASK: Include the task 'run_test.yml' 11728 1726882180.08671: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882180.08675: getting variables 11728 1726882180.08676: in VariableManager get_vars() 11728 1726882180.08706: Calling all_inventory to load vars for managed_node3 11728 1726882180.08708: Calling groups_inventory to load vars for managed_node3 11728 1726882180.08712: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882180.08722: Calling all_plugins_play to load vars for managed_node3 11728 1726882180.08725: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882180.08727: Calling groups_plugins_play to load vars for managed_node3 11728 1726882180.09153: done sending task result for task 12673a56-9f93-5c28-a762-00000000000b 11728 1726882180.09156: WORKER PROCESS EXITING 11728 1726882180.09178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882180.09380: done with get_vars() 11728 1726882180.09389: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:42 Friday 20 September 2024 21:29:40 -0400 (0:00:00.031) 0:00:04.947 ****** 11728 1726882180.09465: entering _queue_task() for managed_node3/include_tasks 11728 1726882180.09718: worker is 1 (out of 1 available) 11728 1726882180.09729: exiting _queue_task() for managed_node3/include_tasks 11728 1726882180.09741: done queuing things up, now waiting for results queue to drain 11728 1726882180.09743: waiting for pending results... 
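[editor's note] The "Include the task 'run_test.yml'" task queued above (task path tests_bond_options.yml:42) behaves like a plain static-path include: the trace that follows shows .../tasks/run_test.yml being loaded and its blocks extended onto the task list for managed_node3. A minimal sketch of such an include, assuming a path relative to the test playbook:

    # Sketch only -- the exact relative path is assumed, not confirmed by the log.
    - name: Include the task 'run_test.yml'
      include_tasks: tasks/run_test.yml
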
11728 1726882180.10072: running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' 11728 1726882180.10251: in run() - task 12673a56-9f93-5c28-a762-00000000000d 11728 1726882180.10263: variable 'ansible_search_path' from source: unknown 11728 1726882180.10303: calling self._execute() 11728 1726882180.10432: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.10443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.10464: variable 'omit' from source: magic vars 11728 1726882180.10890: variable 'ansible_distribution_major_version' from source: facts 11728 1726882180.10914: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882180.10925: _execute() done 11728 1726882180.10946: dumping result to json 11728 1726882180.10954: done dumping result, returning 11728 1726882180.10974: done running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' [12673a56-9f93-5c28-a762-00000000000d] 11728 1726882180.10977: sending task result for task 12673a56-9f93-5c28-a762-00000000000d 11728 1726882180.11128: no more pending results, returning what we have 11728 1726882180.11133: in VariableManager get_vars() 11728 1726882180.11165: Calling all_inventory to load vars for managed_node3 11728 1726882180.11168: Calling groups_inventory to load vars for managed_node3 11728 1726882180.11171: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882180.11183: Calling all_plugins_play to load vars for managed_node3 11728 1726882180.11297: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882180.11303: Calling groups_plugins_play to load vars for managed_node3 11728 1726882180.11567: done sending task result for task 12673a56-9f93-5c28-a762-00000000000d 11728 1726882180.11570: WORKER PROCESS EXITING 11728 1726882180.11595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882180.11944: done with get_vars() 11728 1726882180.11951: variable 'ansible_search_path' from source: unknown 11728 1726882180.11962: we have included files to process 11728 1726882180.11963: generating all_blocks data 11728 1726882180.11964: done generating all_blocks data 11728 1726882180.11965: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 11728 1726882180.11966: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 11728 1726882180.11968: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 11728 1726882180.12503: in VariableManager get_vars() 11728 1726882180.12518: done with get_vars() 11728 1726882180.12557: in VariableManager get_vars() 11728 1726882180.12571: done with get_vars() 11728 1726882180.12615: in VariableManager get_vars() 11728 1726882180.12629: done with get_vars() 11728 1726882180.12666: in VariableManager get_vars() 11728 1726882180.12680: done with get_vars() 11728 1726882180.12735: in VariableManager get_vars() 11728 1726882180.12750: done with get_vars() 11728 1726882180.13099: in VariableManager get_vars() 11728 1726882180.13117: done with get_vars() 11728 1726882180.13132: done processing included file 11728 1726882180.13133: iterating over new_blocks loaded from include file 11728 1726882180.13135: in VariableManager get_vars() 11728 
1726882180.13145: done with get_vars() 11728 1726882180.13146: filtering new block on tags 11728 1726882180.13230: done filtering new block on tags 11728 1726882180.13232: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node3 11728 1726882180.13240: extending task lists for all hosts with included blocks 11728 1726882180.13269: done extending task lists 11728 1726882180.13270: done processing included files 11728 1726882180.13270: results queue empty 11728 1726882180.13271: checking for any_errors_fatal 11728 1726882180.13275: done checking for any_errors_fatal 11728 1726882180.13275: checking for max_fail_percentage 11728 1726882180.13276: done checking for max_fail_percentage 11728 1726882180.13277: checking to see if all hosts have failed and the running result is not ok 11728 1726882180.13278: done checking to see if all hosts have failed 11728 1726882180.13278: getting the remaining hosts for this loop 11728 1726882180.13279: done getting the remaining hosts for this loop 11728 1726882180.13281: getting the next task for host managed_node3 11728 1726882180.13284: done getting next task for host managed_node3 11728 1726882180.13286: ^ task is: TASK: TEST: {{ lsr_description }} 11728 1726882180.13288: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882180.13289: getting variables 11728 1726882180.13290: in VariableManager get_vars() 11728 1726882180.13303: Calling all_inventory to load vars for managed_node3 11728 1726882180.13305: Calling groups_inventory to load vars for managed_node3 11728 1726882180.13308: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882180.13313: Calling all_plugins_play to load vars for managed_node3 11728 1726882180.13315: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882180.13318: Calling groups_plugins_play to load vars for managed_node3 11728 1726882180.13481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882180.13657: done with get_vars() 11728 1726882180.13665: done getting variables 11728 1726882180.13708: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11728 1726882180.13825: variable 'lsr_description' from source: include params TASK [TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.] 
*** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 21:29:40 -0400 (0:00:00.043) 0:00:04.991 ****** 11728 1726882180.13861: entering _queue_task() for managed_node3/debug 11728 1726882180.14304: worker is 1 (out of 1 available) 11728 1726882180.14312: exiting _queue_task() for managed_node3/debug 11728 1726882180.14319: done queuing things up, now waiting for results queue to drain 11728 1726882180.14320: waiting for pending results... 11728 1726882180.14371: running TaskExecutor() for managed_node3/TASK: TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. 11728 1726882180.14473: in run() - task 12673a56-9f93-5c28-a762-000000000088 11728 1726882180.14489: variable 'ansible_search_path' from source: unknown 11728 1726882180.14499: variable 'ansible_search_path' from source: unknown 11728 1726882180.14534: calling self._execute() 11728 1726882180.14614: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.14624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.14636: variable 'omit' from source: magic vars 11728 1726882180.14977: variable 'ansible_distribution_major_version' from source: facts 11728 1726882180.14999: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882180.15010: variable 'omit' from source: magic vars 11728 1726882180.15046: variable 'omit' from source: magic vars 11728 1726882180.15201: variable 'lsr_description' from source: include params 11728 1726882180.15205: variable 'omit' from source: magic vars 11728 1726882180.15216: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882180.15254: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882180.15280: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882180.15311: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882180.15330: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882180.15362: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882180.15371: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.15378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.15482: Set connection var ansible_connection to ssh 11728 1726882180.15499: Set connection var ansible_shell_executable to /bin/sh 11728 1726882180.15508: Set connection var ansible_timeout to 10 11728 1726882180.15514: Set connection var ansible_shell_type to sh 11728 1726882180.15529: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882180.15537: Set connection var ansible_pipelining to False 11728 1726882180.15558: variable 'ansible_shell_executable' from source: unknown 11728 1726882180.15564: variable 'ansible_connection' from source: unknown 11728 1726882180.15569: variable 'ansible_module_compression' from source: unknown 11728 1726882180.15574: variable 'ansible_shell_type' from source: unknown 11728 
1726882180.15578: variable 'ansible_shell_executable' from source: unknown 11728 1726882180.15583: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.15588: variable 'ansible_pipelining' from source: unknown 11728 1726882180.15595: variable 'ansible_timeout' from source: unknown 11728 1726882180.15602: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.15803: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882180.15806: variable 'omit' from source: magic vars 11728 1726882180.15809: starting attempt loop 11728 1726882180.15811: running the handler 11728 1726882180.15820: handler run complete 11728 1726882180.15839: attempt loop complete, returning result 11728 1726882180.15846: _execute() done 11728 1726882180.15853: dumping result to json 11728 1726882180.15861: done dumping result, returning 11728 1726882180.15876: done running TaskExecutor() for managed_node3/TASK: TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. [12673a56-9f93-5c28-a762-000000000088] 11728 1726882180.15885: sending task result for task 12673a56-9f93-5c28-a762-000000000088 ok: [managed_node3] => {} MSG: ########## Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. ########## 11728 1726882180.16031: no more pending results, returning what we have 11728 1726882180.16035: results queue empty 11728 1726882180.16036: checking for any_errors_fatal 11728 1726882180.16037: done checking for any_errors_fatal 11728 1726882180.16037: checking for max_fail_percentage 11728 1726882180.16039: done checking for max_fail_percentage 11728 1726882180.16040: checking to see if all hosts have failed and the running result is not ok 11728 1726882180.16041: done checking to see if all hosts have failed 11728 1726882180.16041: getting the remaining hosts for this loop 11728 1726882180.16043: done getting the remaining hosts for this loop 11728 1726882180.16046: getting the next task for host managed_node3 11728 1726882180.16051: done getting next task for host managed_node3 11728 1726882180.16053: ^ task is: TASK: Show item 11728 1726882180.16056: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882180.16059: getting variables 11728 1726882180.16061: in VariableManager get_vars() 11728 1726882180.16090: Calling all_inventory to load vars for managed_node3 11728 1726882180.16095: Calling groups_inventory to load vars for managed_node3 11728 1726882180.16276: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882180.16283: done sending task result for task 12673a56-9f93-5c28-a762-000000000088 11728 1726882180.16286: WORKER PROCESS EXITING 11728 1726882180.16299: Calling all_plugins_play to load vars for managed_node3 11728 1726882180.16302: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882180.16306: Calling groups_plugins_play to load vars for managed_node3 11728 1726882180.16515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882180.16716: done with get_vars() 11728 1726882180.16725: done getting variables 11728 1726882180.16784: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 21:29:40 -0400 (0:00:00.029) 0:00:05.020 ****** 11728 1726882180.16820: entering _queue_task() for managed_node3/debug 11728 1726882180.17106: worker is 1 (out of 1 available) 11728 1726882180.17119: exiting _queue_task() for managed_node3/debug 11728 1726882180.17131: done queuing things up, now waiting for results queue to drain 11728 1726882180.17133: waiting for pending results... 
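[editor's note] The TASK [Show item] header above (run_test.yml:9) drives the per-item debug output that follows: for each name in a fixed list the task looks up and prints the variable of that name (ansible_loop_var is "item", and lsr_assert_when comes back as "VARIABLE IS NOT DEFINED!"). A hedged reconstruction based on those loop results; the actual task in the collection may differ:

    # Reconstructed from the loop output below, for illustration only.
    - name: Show item
      debug:
        var: "{{ item }}"
      loop:
        - lsr_description
        - lsr_setup
        - lsr_test
        - lsr_assert
        - lsr_assert_when
        - lsr_fail_debug
        - lsr_cleanup
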
11728 1726882180.17338: running TaskExecutor() for managed_node3/TASK: Show item 11728 1726882180.17429: in run() - task 12673a56-9f93-5c28-a762-000000000089 11728 1726882180.17448: variable 'ansible_search_path' from source: unknown 11728 1726882180.17456: variable 'ansible_search_path' from source: unknown 11728 1726882180.17513: variable 'omit' from source: magic vars 11728 1726882180.17640: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.17659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.17698: variable 'omit' from source: magic vars 11728 1726882180.18352: variable 'ansible_distribution_major_version' from source: facts 11728 1726882180.18369: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882180.18408: variable 'omit' from source: magic vars 11728 1726882180.18424: variable 'omit' from source: magic vars 11728 1726882180.18474: variable 'item' from source: unknown 11728 1726882180.18562: variable 'item' from source: unknown 11728 1726882180.18603: variable 'omit' from source: magic vars 11728 1726882180.18638: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882180.18699: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882180.18707: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882180.18731: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882180.18765: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882180.18802: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882180.18805: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.18809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.18910: Set connection var ansible_connection to ssh 11728 1726882180.18919: Set connection var ansible_shell_executable to /bin/sh 11728 1726882180.18924: Set connection var ansible_timeout to 10 11728 1726882180.18926: Set connection var ansible_shell_type to sh 11728 1726882180.18933: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882180.18937: Set connection var ansible_pipelining to False 11728 1726882180.18952: variable 'ansible_shell_executable' from source: unknown 11728 1726882180.18955: variable 'ansible_connection' from source: unknown 11728 1726882180.18958: variable 'ansible_module_compression' from source: unknown 11728 1726882180.18960: variable 'ansible_shell_type' from source: unknown 11728 1726882180.18962: variable 'ansible_shell_executable' from source: unknown 11728 1726882180.18965: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.18968: variable 'ansible_pipelining' from source: unknown 11728 1726882180.18971: variable 'ansible_timeout' from source: unknown 11728 1726882180.18975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.19072: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882180.19081: variable 'omit' from source: magic vars 11728 1726882180.19085: starting attempt loop 11728 1726882180.19088: running the handler 11728 1726882180.19125: variable 'lsr_description' from source: include params 11728 1726882180.19169: variable 'lsr_description' from source: include params 11728 1726882180.19176: handler run complete 11728 1726882180.19189: attempt loop complete, returning result 11728 1726882180.19203: variable 'item' from source: unknown 11728 1726882180.19249: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device." } 11728 1726882180.19377: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.19380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.19382: variable 'omit' from source: magic vars 11728 1726882180.19451: variable 'ansible_distribution_major_version' from source: facts 11728 1726882180.19454: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882180.19459: variable 'omit' from source: magic vars 11728 1726882180.19470: variable 'omit' from source: magic vars 11728 1726882180.19501: variable 'item' from source: unknown 11728 1726882180.19540: variable 'item' from source: unknown 11728 1726882180.19551: variable 'omit' from source: magic vars 11728 1726882180.19565: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882180.19572: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882180.19578: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882180.19587: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882180.19590: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.19592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.19641: Set connection var ansible_connection to ssh 11728 1726882180.19647: Set connection var ansible_shell_executable to /bin/sh 11728 1726882180.19652: Set connection var ansible_timeout to 10 11728 1726882180.19654: Set connection var ansible_shell_type to sh 11728 1726882180.19661: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882180.19665: Set connection var ansible_pipelining to False 11728 1726882180.19680: variable 'ansible_shell_executable' from source: unknown 11728 1726882180.19682: variable 'ansible_connection' from source: unknown 11728 1726882180.19685: variable 'ansible_module_compression' from source: unknown 11728 1726882180.19688: variable 'ansible_shell_type' from source: unknown 11728 1726882180.19690: variable 'ansible_shell_executable' from source: unknown 11728 1726882180.19692: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.19698: variable 
'ansible_pipelining' from source: unknown 11728 1726882180.19700: variable 'ansible_timeout' from source: unknown 11728 1726882180.19703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.19759: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882180.19767: variable 'omit' from source: magic vars 11728 1726882180.19770: starting attempt loop 11728 1726882180.19772: running the handler 11728 1726882180.19788: variable 'lsr_setup' from source: include params 11728 1726882180.19837: variable 'lsr_setup' from source: include params 11728 1726882180.19869: handler run complete 11728 1726882180.19881: attempt loop complete, returning result 11728 1726882180.19896: variable 'item' from source: unknown 11728 1726882180.20098: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ "tasks/create_test_interfaces_with_dhcp.yml", "tasks/assert_dhcp_device_present.yml" ] } 11728 1726882180.20163: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.20166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.20169: variable 'omit' from source: magic vars 11728 1726882180.20227: variable 'ansible_distribution_major_version' from source: facts 11728 1726882180.20236: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882180.20243: variable 'omit' from source: magic vars 11728 1726882180.20257: variable 'omit' from source: magic vars 11728 1726882180.20296: variable 'item' from source: unknown 11728 1726882180.20353: variable 'item' from source: unknown 11728 1726882180.20370: variable 'omit' from source: magic vars 11728 1726882180.20388: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882180.20402: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882180.20410: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882180.20423: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882180.20430: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.20441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.20500: Set connection var ansible_connection to ssh 11728 1726882180.20521: Set connection var ansible_shell_executable to /bin/sh 11728 1726882180.20536: Set connection var ansible_timeout to 10 11728 1726882180.20598: Set connection var ansible_shell_type to sh 11728 1726882180.20797: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882180.20800: Set connection var ansible_pipelining to False 11728 1726882180.20802: variable 'ansible_shell_executable' from source: unknown 11728 1726882180.20804: variable 'ansible_connection' from source: unknown 11728 1726882180.20805: variable 'ansible_module_compression' from source: unknown 11728 1726882180.20807: variable 'ansible_shell_type' from source: 
unknown 11728 1726882180.20809: variable 'ansible_shell_executable' from source: unknown 11728 1726882180.20810: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.20812: variable 'ansible_pipelining' from source: unknown 11728 1726882180.20813: variable 'ansible_timeout' from source: unknown 11728 1726882180.20815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.20998: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882180.21001: variable 'omit' from source: magic vars 11728 1726882180.21003: starting attempt loop 11728 1726882180.21005: running the handler 11728 1726882180.21007: variable 'lsr_test' from source: include params 11728 1726882180.21008: variable 'lsr_test' from source: include params 11728 1726882180.21010: handler run complete 11728 1726882180.21012: attempt loop complete, returning result 11728 1726882180.21034: variable 'item' from source: unknown 11728 1726882180.21090: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/create_bond_profile.yml" ] } 11728 1726882180.21222: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.21233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.21244: variable 'omit' from source: magic vars 11728 1726882180.21376: variable 'ansible_distribution_major_version' from source: facts 11728 1726882180.21386: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882180.21395: variable 'omit' from source: magic vars 11728 1726882180.21499: variable 'omit' from source: magic vars 11728 1726882180.21506: variable 'item' from source: unknown 11728 1726882180.21508: variable 'item' from source: unknown 11728 1726882180.21525: variable 'omit' from source: magic vars 11728 1726882180.21543: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882180.21552: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882180.21559: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882180.21571: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882180.21576: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.21581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.21647: Set connection var ansible_connection to ssh 11728 1726882180.21659: Set connection var ansible_shell_executable to /bin/sh 11728 1726882180.21667: Set connection var ansible_timeout to 10 11728 1726882180.21673: Set connection var ansible_shell_type to sh 11728 1726882180.21683: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882180.21690: Set connection var ansible_pipelining to False 11728 1726882180.21714: variable 'ansible_shell_executable' from source: unknown 11728 1726882180.21726: variable 'ansible_connection' from source: unknown 
11728 1726882180.21733: variable 'ansible_module_compression' from source: unknown 11728 1726882180.21739: variable 'ansible_shell_type' from source: unknown 11728 1726882180.21744: variable 'ansible_shell_executable' from source: unknown 11728 1726882180.21749: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.21754: variable 'ansible_pipelining' from source: unknown 11728 1726882180.21759: variable 'ansible_timeout' from source: unknown 11728 1726882180.21764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.21849: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882180.21862: variable 'omit' from source: magic vars 11728 1726882180.21870: starting attempt loop 11728 1726882180.21877: running the handler 11728 1726882180.21941: variable 'lsr_assert' from source: include params 11728 1726882180.21968: variable 'lsr_assert' from source: include params 11728 1726882180.21987: handler run complete 11728 1726882180.22005: attempt loop complete, returning result 11728 1726882180.22021: variable 'item' from source: unknown 11728 1726882180.22084: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_controller_device_present.yml", "tasks/assert_bond_port_profile_present.yml", "tasks/assert_bond_options.yml" ] } 11728 1726882180.22398: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.22401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.22409: variable 'omit' from source: magic vars 11728 1726882180.22411: variable 'ansible_distribution_major_version' from source: facts 11728 1726882180.22413: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882180.22415: variable 'omit' from source: magic vars 11728 1726882180.22417: variable 'omit' from source: magic vars 11728 1726882180.22450: variable 'item' from source: unknown 11728 1726882180.22508: variable 'item' from source: unknown 11728 1726882180.22524: variable 'omit' from source: magic vars 11728 1726882180.22547: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882180.22556: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882180.22564: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882180.22576: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882180.22582: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.22588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.22649: Set connection var ansible_connection to ssh 11728 1726882180.22659: Set connection var ansible_shell_executable to /bin/sh 11728 1726882180.22667: Set connection var ansible_timeout to 10 11728 1726882180.22672: Set connection var ansible_shell_type to sh 11728 1726882180.22681: Set connection var 
ansible_module_compression to ZIP_DEFLATED 11728 1726882180.22687: Set connection var ansible_pipelining to False 11728 1726882180.22710: variable 'ansible_shell_executable' from source: unknown 11728 1726882180.22716: variable 'ansible_connection' from source: unknown 11728 1726882180.22721: variable 'ansible_module_compression' from source: unknown 11728 1726882180.22748: variable 'ansible_shell_type' from source: unknown 11728 1726882180.22750: variable 'ansible_shell_executable' from source: unknown 11728 1726882180.22752: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.22754: variable 'ansible_pipelining' from source: unknown 11728 1726882180.22755: variable 'ansible_timeout' from source: unknown 11728 1726882180.22757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.22824: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882180.22834: variable 'omit' from source: magic vars 11728 1726882180.22856: starting attempt loop 11728 1726882180.22858: running the handler 11728 1726882180.22939: handler run complete 11728 1726882180.22952: attempt loop complete, returning result 11728 1726882180.22974: variable 'item' from source: unknown 11728 1726882180.23019: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 11728 1726882180.23351: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.23354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.23356: variable 'omit' from source: magic vars 11728 1726882180.23359: variable 'ansible_distribution_major_version' from source: facts 11728 1726882180.23361: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882180.23363: variable 'omit' from source: magic vars 11728 1726882180.23365: variable 'omit' from source: magic vars 11728 1726882180.23367: variable 'item' from source: unknown 11728 1726882180.23369: variable 'item' from source: unknown 11728 1726882180.23371: variable 'omit' from source: magic vars 11728 1726882180.23374: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882180.23381: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882180.23387: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882180.23398: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882180.23401: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.23404: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.23445: Set connection var ansible_connection to ssh 11728 1726882180.23452: Set connection var ansible_shell_executable to /bin/sh 11728 1726882180.23459: Set connection var ansible_timeout to 10 11728 1726882180.23461: Set connection var 
ansible_shell_type to sh 11728 1726882180.23466: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882180.23470: Set connection var ansible_pipelining to False 11728 1726882180.23485: variable 'ansible_shell_executable' from source: unknown 11728 1726882180.23487: variable 'ansible_connection' from source: unknown 11728 1726882180.23490: variable 'ansible_module_compression' from source: unknown 11728 1726882180.23492: variable 'ansible_shell_type' from source: unknown 11728 1726882180.23498: variable 'ansible_shell_executable' from source: unknown 11728 1726882180.23501: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.23503: variable 'ansible_pipelining' from source: unknown 11728 1726882180.23505: variable 'ansible_timeout' from source: unknown 11728 1726882180.23508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.23565: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882180.23571: variable 'omit' from source: magic vars 11728 1726882180.23574: starting attempt loop 11728 1726882180.23576: running the handler 11728 1726882180.23588: variable 'lsr_fail_debug' from source: play vars 11728 1726882180.23639: variable 'lsr_fail_debug' from source: play vars 11728 1726882180.23648: handler run complete 11728 1726882180.23657: attempt loop complete, returning result 11728 1726882180.23668: variable 'item' from source: unknown 11728 1726882180.23712: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 11728 1726882180.23778: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.23781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.23786: variable 'omit' from source: magic vars 11728 1726882180.23879: variable 'ansible_distribution_major_version' from source: facts 11728 1726882180.23882: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882180.23887: variable 'omit' from source: magic vars 11728 1726882180.23905: variable 'omit' from source: magic vars 11728 1726882180.23928: variable 'item' from source: unknown 11728 1726882180.23969: variable 'item' from source: unknown 11728 1726882180.23979: variable 'omit' from source: magic vars 11728 1726882180.23992: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882180.23999: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882180.24012: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882180.24016: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882180.24019: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.24022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.24061: Set connection var ansible_connection to ssh 11728 
1726882180.24067: Set connection var ansible_shell_executable to /bin/sh 11728 1726882180.24072: Set connection var ansible_timeout to 10 11728 1726882180.24075: Set connection var ansible_shell_type to sh 11728 1726882180.24081: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882180.24085: Set connection var ansible_pipelining to False 11728 1726882180.24102: variable 'ansible_shell_executable' from source: unknown 11728 1726882180.24105: variable 'ansible_connection' from source: unknown 11728 1726882180.24108: variable 'ansible_module_compression' from source: unknown 11728 1726882180.24110: variable 'ansible_shell_type' from source: unknown 11728 1726882180.24119: variable 'ansible_shell_executable' from source: unknown 11728 1726882180.24121: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.24124: variable 'ansible_pipelining' from source: unknown 11728 1726882180.24125: variable 'ansible_timeout' from source: unknown 11728 1726882180.24128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.24192: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882180.24199: variable 'omit' from source: magic vars 11728 1726882180.24202: starting attempt loop 11728 1726882180.24204: running the handler 11728 1726882180.24234: variable 'lsr_cleanup' from source: include params 11728 1726882180.24300: variable 'lsr_cleanup' from source: include params 11728 1726882180.24309: handler run complete 11728 1726882180.24326: attempt loop complete, returning result 11728 1726882180.24499: variable 'item' from source: unknown 11728 1726882180.24503: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_bond_profile+device.yml", "tasks/remove_test_interfaces_with_dhcp.yml" ] } 11728 1726882180.24558: dumping result to json 11728 1726882180.24560: done dumping result, returning 11728 1726882180.24562: done running TaskExecutor() for managed_node3/TASK: Show item [12673a56-9f93-5c28-a762-000000000089] 11728 1726882180.24564: sending task result for task 12673a56-9f93-5c28-a762-000000000089 11728 1726882180.24649: no more pending results, returning what we have 11728 1726882180.24651: results queue empty 11728 1726882180.24652: checking for any_errors_fatal 11728 1726882180.24656: done checking for any_errors_fatal 11728 1726882180.24657: checking for max_fail_percentage 11728 1726882180.24658: done checking for max_fail_percentage 11728 1726882180.24659: checking to see if all hosts have failed and the running result is not ok 11728 1726882180.24659: done checking to see if all hosts have failed 11728 1726882180.24660: getting the remaining hosts for this loop 11728 1726882180.24661: done getting the remaining hosts for this loop 11728 1726882180.24664: getting the next task for host managed_node3 11728 1726882180.24669: done getting next task for host managed_node3 11728 1726882180.24671: ^ task is: TASK: Include the task 'show_interfaces.yml' 11728 1726882180.24673: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks 
child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882180.24676: getting variables 11728 1726882180.24677: in VariableManager get_vars() 11728 1726882180.24704: Calling all_inventory to load vars for managed_node3 11728 1726882180.24707: Calling groups_inventory to load vars for managed_node3 11728 1726882180.24710: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882180.24720: Calling all_plugins_play to load vars for managed_node3 11728 1726882180.24722: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882180.24725: Calling groups_plugins_play to load vars for managed_node3 11728 1726882180.24995: done sending task result for task 12673a56-9f93-5c28-a762-000000000089 11728 1726882180.24999: WORKER PROCESS EXITING 11728 1726882180.25012: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882180.25198: done with get_vars() 11728 1726882180.25207: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 21:29:40 -0400 (0:00:00.084) 0:00:05.105 ****** 11728 1726882180.25285: entering _queue_task() for managed_node3/include_tasks 11728 1726882180.25514: worker is 1 (out of 1 available) 11728 1726882180.25525: exiting _queue_task() for managed_node3/include_tasks 11728 1726882180.25537: done queuing things up, now waiting for results queue to drain 11728 1726882180.25538: waiting for pending results... 
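For orientation, a minimal sketch of a debug task shape that would produce the looped "Show item" results above (lsr_assert, lsr_assert_when, lsr_fail_debug, lsr_cleanup). The task name matches the log, but the loop list and exact options are assumptions inferred from this run rather than taken from the collection's test files:

```yaml
# Hedged sketch only: a looped debug over variable names, as suggested by the
# "Show item" results above. The real task in run_test.yml may be written differently.
- name: Show item
  ansible.builtin.debug:
    var: "{{ item }}"   # prints the named variable, or "VARIABLE IS NOT DEFINED!" when unset
  loop:
    - lsr_assert
    - lsr_assert_when
    - lsr_fail_debug
    - lsr_cleanup
```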
11728 1726882180.25771: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 11728 1726882180.25858: in run() - task 12673a56-9f93-5c28-a762-00000000008a 11728 1726882180.25877: variable 'ansible_search_path' from source: unknown 11728 1726882180.25881: variable 'ansible_search_path' from source: unknown 11728 1726882180.25906: calling self._execute() 11728 1726882180.25964: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.25968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.26000: variable 'omit' from source: magic vars 11728 1726882180.26235: variable 'ansible_distribution_major_version' from source: facts 11728 1726882180.26243: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882180.26249: _execute() done 11728 1726882180.26252: dumping result to json 11728 1726882180.26254: done dumping result, returning 11728 1726882180.26260: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [12673a56-9f93-5c28-a762-00000000008a] 11728 1726882180.26265: sending task result for task 12673a56-9f93-5c28-a762-00000000008a 11728 1726882180.26371: no more pending results, returning what we have 11728 1726882180.26376: in VariableManager get_vars() 11728 1726882180.26410: Calling all_inventory to load vars for managed_node3 11728 1726882180.26414: Calling groups_inventory to load vars for managed_node3 11728 1726882180.26416: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882180.26425: Calling all_plugins_play to load vars for managed_node3 11728 1726882180.26427: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882180.26430: Calling groups_plugins_play to load vars for managed_node3 11728 1726882180.26565: done sending task result for task 12673a56-9f93-5c28-a762-00000000008a 11728 1726882180.26568: WORKER PROCESS EXITING 11728 1726882180.26578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882180.26687: done with get_vars() 11728 1726882180.26697: variable 'ansible_search_path' from source: unknown 11728 1726882180.26698: variable 'ansible_search_path' from source: unknown 11728 1726882180.26723: we have included files to process 11728 1726882180.26724: generating all_blocks data 11728 1726882180.26725: done generating all_blocks data 11728 1726882180.26728: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 11728 1726882180.26729: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 11728 1726882180.26730: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 11728 1726882180.26826: in VariableManager get_vars() 11728 1726882180.26837: done with get_vars() 11728 1726882180.26909: done processing included file 11728 1726882180.26910: iterating over new_blocks loaded from include file 11728 1726882180.26911: in VariableManager get_vars() 11728 1726882180.26920: done with get_vars() 11728 1726882180.26921: filtering new block on tags 11728 1726882180.26940: done filtering new block on tags 11728 1726882180.26941: done iterating over new_blocks loaded from include file included: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 11728 1726882180.26945: extending task lists for all hosts with included blocks 11728 1726882180.27197: done extending task lists 11728 1726882180.27198: done processing included files 11728 1726882180.27198: results queue empty 11728 1726882180.27199: checking for any_errors_fatal 11728 1726882180.27201: done checking for any_errors_fatal 11728 1726882180.27202: checking for max_fail_percentage 11728 1726882180.27203: done checking for max_fail_percentage 11728 1726882180.27203: checking to see if all hosts have failed and the running result is not ok 11728 1726882180.27204: done checking to see if all hosts have failed 11728 1726882180.27204: getting the remaining hosts for this loop 11728 1726882180.27205: done getting the remaining hosts for this loop 11728 1726882180.27206: getting the next task for host managed_node3 11728 1726882180.27209: done getting next task for host managed_node3 11728 1726882180.27210: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 11728 1726882180.27212: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882180.27214: getting variables 11728 1726882180.27214: in VariableManager get_vars() 11728 1726882180.27220: Calling all_inventory to load vars for managed_node3 11728 1726882180.27222: Calling groups_inventory to load vars for managed_node3 11728 1726882180.27223: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882180.27226: Calling all_plugins_play to load vars for managed_node3 11728 1726882180.27228: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882180.27231: Calling groups_plugins_play to load vars for managed_node3 11728 1726882180.27333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882180.27479: done with get_vars() 11728 1726882180.27485: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:29:40 -0400 (0:00:00.022) 0:00:05.127 ****** 11728 1726882180.27552: entering _queue_task() for managed_node3/include_tasks 11728 1726882180.27755: worker is 1 (out of 1 available) 11728 1726882180.27766: exiting _queue_task() for managed_node3/include_tasks 11728 1726882180.27775: done queuing things up, now waiting for results queue to drain 11728 1726882180.27776: waiting for pending results... 
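The records above and below trace an include chain: run_test.yml:21 includes show_interfaces.yml, whose first task (line 3) in turn includes get_current_interfaces.yml. A minimal sketch of what those include tasks could look like; the relative paths are assumptions based on the task paths printed in this run:

```yaml
# Hedged sketch of the include chain traced in this log; file locations are
# assumed relative to the playbooks/tasks directory shown in the task paths.

# run_test.yml (around line 21)
- name: Include the task 'show_interfaces.yml'
  ansible.builtin.include_tasks: tasks/show_interfaces.yml

# show_interfaces.yml (line 3)
- name: Include the task 'get_current_interfaces.yml'
  ansible.builtin.include_tasks: tasks/get_current_interfaces.yml
```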
11728 1726882180.28004: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 11728 1726882180.28152: in run() - task 12673a56-9f93-5c28-a762-0000000000b1 11728 1726882180.28157: variable 'ansible_search_path' from source: unknown 11728 1726882180.28160: variable 'ansible_search_path' from source: unknown 11728 1726882180.28162: calling self._execute() 11728 1726882180.28216: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.28242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.28248: variable 'omit' from source: magic vars 11728 1726882180.28598: variable 'ansible_distribution_major_version' from source: facts 11728 1726882180.28602: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882180.28604: _execute() done 11728 1726882180.28606: dumping result to json 11728 1726882180.28609: done dumping result, returning 11728 1726882180.28612: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [12673a56-9f93-5c28-a762-0000000000b1] 11728 1726882180.28614: sending task result for task 12673a56-9f93-5c28-a762-0000000000b1 11728 1726882180.28669: done sending task result for task 12673a56-9f93-5c28-a762-0000000000b1 11728 1726882180.28672: WORKER PROCESS EXITING 11728 1726882180.28710: no more pending results, returning what we have 11728 1726882180.28716: in VariableManager get_vars() 11728 1726882180.28748: Calling all_inventory to load vars for managed_node3 11728 1726882180.28751: Calling groups_inventory to load vars for managed_node3 11728 1726882180.28755: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882180.28775: Calling all_plugins_play to load vars for managed_node3 11728 1726882180.28779: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882180.28782: Calling groups_plugins_play to load vars for managed_node3 11728 1726882180.29081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882180.29382: done with get_vars() 11728 1726882180.29389: variable 'ansible_search_path' from source: unknown 11728 1726882180.29390: variable 'ansible_search_path' from source: unknown 11728 1726882180.29452: we have included files to process 11728 1726882180.29453: generating all_blocks data 11728 1726882180.29455: done generating all_blocks data 11728 1726882180.29457: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 11728 1726882180.29458: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 11728 1726882180.29462: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 11728 1726882180.29689: done processing included file 11728 1726882180.29691: iterating over new_blocks loaded from include file 11728 1726882180.29692: in VariableManager get_vars() 11728 1726882180.29704: done with get_vars() 11728 1726882180.29706: filtering new block on tags 11728 1726882180.29725: done filtering new block on tags 11728 1726882180.29727: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed_node3 11728 1726882180.29730: extending task lists for all hosts with included blocks 11728 1726882180.29824: done extending task lists 11728 1726882180.29825: done processing included files 11728 1726882180.29826: results queue empty 11728 1726882180.29827: checking for any_errors_fatal 11728 1726882180.29830: done checking for any_errors_fatal 11728 1726882180.29830: checking for max_fail_percentage 11728 1726882180.29831: done checking for max_fail_percentage 11728 1726882180.29832: checking to see if all hosts have failed and the running result is not ok 11728 1726882180.29833: done checking to see if all hosts have failed 11728 1726882180.29833: getting the remaining hosts for this loop 11728 1726882180.29835: done getting the remaining hosts for this loop 11728 1726882180.29839: getting the next task for host managed_node3 11728 1726882180.29842: done getting next task for host managed_node3 11728 1726882180.29843: ^ task is: TASK: Gather current interface info 11728 1726882180.29845: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882180.29846: getting variables 11728 1726882180.29847: in VariableManager get_vars() 11728 1726882180.29852: Calling all_inventory to load vars for managed_node3 11728 1726882180.29854: Calling groups_inventory to load vars for managed_node3 11728 1726882180.29855: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882180.29858: Calling all_plugins_play to load vars for managed_node3 11728 1726882180.29859: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882180.29861: Calling groups_plugins_play to load vars for managed_node3 11728 1726882180.29960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882180.30071: done with get_vars() 11728 1726882180.30078: done getting variables 11728 1726882180.30110: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:29:40 -0400 (0:00:00.025) 0:00:05.153 ****** 11728 1726882180.30129: entering _queue_task() for managed_node3/command 11728 1726882180.30287: worker is 1 (out of 1 available) 11728 1726882180.30304: exiting _queue_task() for managed_node3/command 11728 1726882180.30313: done queuing things up, now waiting for results queue to drain 11728 1726882180.30314: waiting for pending results... 
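Each task execution above records the same connection setup: ansible_connection ssh, shell type sh, shell executable /bin/sh, timeout 10, module compression ZIP_DEFLATED, pipelining False. Most of these are reported "from source: unknown", i.e. built-in defaults. Purely as an illustration, the same values could be pinned explicitly as host or group variables:

```yaml
# Illustrative only: the connection settings this run resolves to, written as
# inventory variables. In the run itself they mostly come from defaults, not
# from explicit configuration.
ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_timeout: 10
ansible_pipelining: false
ansible_module_compression: ZIP_DEFLATED
```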
11728 1726882180.30451: running TaskExecutor() for managed_node3/TASK: Gather current interface info 11728 1726882180.30512: in run() - task 12673a56-9f93-5c28-a762-0000000000ec 11728 1726882180.30524: variable 'ansible_search_path' from source: unknown 11728 1726882180.30527: variable 'ansible_search_path' from source: unknown 11728 1726882180.30555: calling self._execute() 11728 1726882180.30609: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.30613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.30620: variable 'omit' from source: magic vars 11728 1726882180.30872: variable 'ansible_distribution_major_version' from source: facts 11728 1726882180.30879: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882180.30886: variable 'omit' from source: magic vars 11728 1726882180.30919: variable 'omit' from source: magic vars 11728 1726882180.30943: variable 'omit' from source: magic vars 11728 1726882180.30972: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882180.31001: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882180.31016: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882180.31029: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882180.31039: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882180.31060: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882180.31064: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.31066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.31134: Set connection var ansible_connection to ssh 11728 1726882180.31142: Set connection var ansible_shell_executable to /bin/sh 11728 1726882180.31147: Set connection var ansible_timeout to 10 11728 1726882180.31149: Set connection var ansible_shell_type to sh 11728 1726882180.31156: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882180.31160: Set connection var ansible_pipelining to False 11728 1726882180.31177: variable 'ansible_shell_executable' from source: unknown 11728 1726882180.31180: variable 'ansible_connection' from source: unknown 11728 1726882180.31183: variable 'ansible_module_compression' from source: unknown 11728 1726882180.31185: variable 'ansible_shell_type' from source: unknown 11728 1726882180.31187: variable 'ansible_shell_executable' from source: unknown 11728 1726882180.31189: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.31197: variable 'ansible_pipelining' from source: unknown 11728 1726882180.31199: variable 'ansible_timeout' from source: unknown 11728 1726882180.31201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.31417: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882180.31420: variable 'omit' from source: magic vars 11728 
1726882180.31423: starting attempt loop 11728 1726882180.31426: running the handler 11728 1726882180.31428: _low_level_execute_command(): starting 11728 1726882180.31430: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882180.32606: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882180.32631: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882180.32650: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882180.32734: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882180.34460: stdout chunk (state=3): >>>/root <<< 11728 1726882180.34591: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882180.34845: stdout chunk (state=3): >>><<< 11728 1726882180.34849: stderr chunk (state=3): >>><<< 11728 1726882180.34853: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882180.34855: _low_level_execute_command(): starting 11728 1726882180.34859: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882180.3474598-12025-229257935364337 `" && echo ansible-tmp-1726882180.3474598-12025-229257935364337="` echo /root/.ansible/tmp/ansible-tmp-1726882180.3474598-12025-229257935364337 `" ) && 
sleep 0' 11728 1726882180.35511: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882180.35525: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882180.35541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882180.35614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882180.35668: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882180.35688: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882180.35725: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882180.35875: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882180.37743: stdout chunk (state=3): >>>ansible-tmp-1726882180.3474598-12025-229257935364337=/root/.ansible/tmp/ansible-tmp-1726882180.3474598-12025-229257935364337 <<< 11728 1726882180.37842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882180.37867: stderr chunk (state=3): >>><<< 11728 1726882180.37870: stdout chunk (state=3): >>><<< 11728 1726882180.37889: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882180.3474598-12025-229257935364337=/root/.ansible/tmp/ansible-tmp-1726882180.3474598-12025-229257935364337 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882180.37922: variable 'ansible_module_compression' from source: unknown 11728 1726882180.37962: ANSIBALLZ: Using generic lock for ansible.legacy.command 11728 1726882180.37966: ANSIBALLZ: Acquiring 
lock 11728 1726882180.37968: ANSIBALLZ: Lock acquired: 139840770723472 11728 1726882180.37971: ANSIBALLZ: Creating module 11728 1726882180.50473: ANSIBALLZ: Writing module into payload 11728 1726882180.50535: ANSIBALLZ: Writing module 11728 1726882180.50551: ANSIBALLZ: Renaming module 11728 1726882180.50556: ANSIBALLZ: Done creating module 11728 1726882180.50570: variable 'ansible_facts' from source: unknown 11728 1726882180.50621: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882180.3474598-12025-229257935364337/AnsiballZ_command.py 11728 1726882180.50718: Sending initial data 11728 1726882180.50721: Sent initial data (156 bytes) 11728 1726882180.51450: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882180.51454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882180.51457: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882180.51459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 11728 1726882180.51461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882180.51644: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882180.51648: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882180.51651: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882180.51847: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882180.53315: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 11728 1726882180.53322: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882180.53361: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882180.53404: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpqn52y6h_ /root/.ansible/tmp/ansible-tmp-1726882180.3474598-12025-229257935364337/AnsiballZ_command.py <<< 11728 1726882180.53407: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882180.3474598-12025-229257935364337/AnsiballZ_command.py" <<< 11728 1726882180.53448: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpqn52y6h_" to remote "/root/.ansible/tmp/ansible-tmp-1726882180.3474598-12025-229257935364337/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882180.3474598-12025-229257935364337/AnsiballZ_command.py" <<< 11728 1726882180.54005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882180.54107: stderr chunk (state=3): >>><<< 11728 1726882180.54111: stdout chunk (state=3): >>><<< 11728 1726882180.54120: done transferring module to remote 11728 1726882180.54134: _low_level_execute_command(): starting 11728 1726882180.54145: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882180.3474598-12025-229257935364337/ /root/.ansible/tmp/ansible-tmp-1726882180.3474598-12025-229257935364337/AnsiballZ_command.py && sleep 0' 11728 1726882180.55226: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882180.55230: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882180.55242: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882180.55252: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11728 1726882180.55308: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882180.55460: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882180.55512: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882180.55550: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882180.57288: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882180.57303: stdout chunk (state=3): >>><<< 11728 1726882180.57314: stderr chunk (state=3): >>><<< 11728 1726882180.57332: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882180.57341: _low_level_execute_command(): starting 11728 1726882180.57349: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882180.3474598-12025-229257935364337/AnsiballZ_command.py && sleep 0' 11728 1726882180.57892: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882180.57912: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882180.57926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882180.57945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882180.57960: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882180.57971: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882180.57985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882180.58010: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882180.58059: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882180.58111: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882180.58130: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882180.58156: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882180.58241: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882180.73453: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:29:40.729855", "end": "2024-09-20 21:29:40.732897", "delta": "0:00:00.003042", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} 
<<< 11728 1726882180.74934: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882180.74955: stdout chunk (state=3): >>><<< 11728 1726882180.74984: stderr chunk (state=3): >>><<< 11728 1726882180.75014: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:29:40.729855", "end": "2024-09-20 21:29:40.732897", "delta": "0:00:00.003042", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
11728 1726882180.75151: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882180.3474598-12025-229257935364337/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882180.75154: _low_level_execute_command(): starting 11728 1726882180.75157: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882180.3474598-12025-229257935364337/ > /dev/null 2>&1 && sleep 0' 11728 1726882180.76452: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882180.76666: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882180.76783: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882180.76865: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882180.78664: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882180.78668: stdout chunk (state=3): >>><<< 11728 1726882180.78670: stderr chunk (state=3): >>><<< 11728 1726882180.78910: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882180.78914: handler run complete 11728 1726882180.78916: Evaluated conditional (False): False 11728 1726882180.78919: attempt loop complete, returning result 11728 1726882180.78921: _execute() done 11728 1726882180.78922: dumping result to json 11728 1726882180.78924: done dumping result, returning 11728 1726882180.78926: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [12673a56-9f93-5c28-a762-0000000000ec] 11728 1726882180.78928: sending task result for task 12673a56-9f93-5c28-a762-0000000000ec 11728 1726882180.79009: done sending task result for task 12673a56-9f93-5c28-a762-0000000000ec 11728 1726882180.79012: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003042", "end": "2024-09-20 21:29:40.732897", "rc": 0, "start": "2024-09-20 21:29:40.729855" } STDOUT: bonding_masters eth0 lo 11728 1726882180.79090: no more pending results, returning what we have 11728 1726882180.79096: results queue empty 11728 1726882180.79097: checking for any_errors_fatal 11728 1726882180.79099: done checking for any_errors_fatal 11728 1726882180.79100: checking for max_fail_percentage 11728 1726882180.79101: done checking for max_fail_percentage 11728 1726882180.79102: checking to see if all hosts have failed and the running result is not ok 11728 1726882180.79103: done checking to see if all hosts have failed 11728 1726882180.79103: getting the remaining hosts for this loop 11728 1726882180.79105: done getting the remaining hosts for this loop 11728 1726882180.79109: getting the next task for host managed_node3 11728 1726882180.79116: done getting next task for host managed_node3 11728 1726882180.79119: ^ task is: TASK: Set current_interfaces 11728 1726882180.79124: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882180.79128: getting variables 11728 1726882180.79130: in VariableManager get_vars() 11728 1726882180.79162: Calling all_inventory to load vars for managed_node3 11728 1726882180.79165: Calling groups_inventory to load vars for managed_node3 11728 1726882180.79170: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882180.79182: Calling all_plugins_play to load vars for managed_node3 11728 1726882180.79185: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882180.79188: Calling groups_plugins_play to load vars for managed_node3 11728 1726882180.79875: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882180.81172: done with get_vars() 11728 1726882180.81183: done getting variables 11728 1726882180.81241: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:29:40 -0400 (0:00:00.511) 0:00:05.665 ****** 11728 1726882180.81274: entering _queue_task() for managed_node3/set_fact 11728 1726882180.81789: worker is 1 (out of 1 available) 11728 1726882180.81906: exiting _queue_task() for managed_node3/set_fact 11728 1726882180.81918: done queuing things up, now waiting for results queue to drain 11728 1726882180.81919: waiting for pending results... 
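The next task, "Set current_interfaces" (get_current_interfaces.yml:9), turns the registered command output into the current_interfaces fact shown below. A minimal sketch, assuming the fact is taken directly from stdout_lines:

```yaml
# Hedged sketch: derives current_interfaces from the registered command
# output; the exact expression used by the collection is an assumption.
- name: Set current_interfaces
  ansible.builtin.set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"
```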
11728 1726882180.82331: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 11728 1726882180.82336: in run() - task 12673a56-9f93-5c28-a762-0000000000ed 11728 1726882180.82340: variable 'ansible_search_path' from source: unknown 11728 1726882180.82343: variable 'ansible_search_path' from source: unknown 11728 1726882180.82361: calling self._execute() 11728 1726882180.82539: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.82700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.82704: variable 'omit' from source: magic vars 11728 1726882180.82942: variable 'ansible_distribution_major_version' from source: facts 11728 1726882180.82957: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882180.82968: variable 'omit' from source: magic vars 11728 1726882180.83026: variable 'omit' from source: magic vars 11728 1726882180.83148: variable '_current_interfaces' from source: set_fact 11728 1726882180.83219: variable 'omit' from source: magic vars 11728 1726882180.83264: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882180.83310: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882180.83332: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882180.83352: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882180.83377: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882180.83411: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882180.83421: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.83429: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.83535: Set connection var ansible_connection to ssh 11728 1726882180.83550: Set connection var ansible_shell_executable to /bin/sh 11728 1726882180.83582: Set connection var ansible_timeout to 10 11728 1726882180.83585: Set connection var ansible_shell_type to sh 11728 1726882180.83587: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882180.83595: Set connection var ansible_pipelining to False 11728 1726882180.83622: variable 'ansible_shell_executable' from source: unknown 11728 1726882180.83692: variable 'ansible_connection' from source: unknown 11728 1726882180.83697: variable 'ansible_module_compression' from source: unknown 11728 1726882180.83700: variable 'ansible_shell_type' from source: unknown 11728 1726882180.83702: variable 'ansible_shell_executable' from source: unknown 11728 1726882180.83703: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.83705: variable 'ansible_pipelining' from source: unknown 11728 1726882180.83707: variable 'ansible_timeout' from source: unknown 11728 1726882180.83709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.83899: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 11728 1726882180.83904: variable 'omit' from source: magic vars 11728 1726882180.83906: starting attempt loop 11728 1726882180.83909: running the handler 11728 1726882180.83911: handler run complete 11728 1726882180.83913: attempt loop complete, returning result 11728 1726882180.83915: _execute() done 11728 1726882180.83917: dumping result to json 11728 1726882180.83919: done dumping result, returning 11728 1726882180.83922: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [12673a56-9f93-5c28-a762-0000000000ed] 11728 1726882180.83924: sending task result for task 12673a56-9f93-5c28-a762-0000000000ed 11728 1726882180.84002: done sending task result for task 12673a56-9f93-5c28-a762-0000000000ed 11728 1726882180.84006: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 11728 1726882180.84095: no more pending results, returning what we have 11728 1726882180.84099: results queue empty 11728 1726882180.84100: checking for any_errors_fatal 11728 1726882180.84109: done checking for any_errors_fatal 11728 1726882180.84110: checking for max_fail_percentage 11728 1726882180.84112: done checking for max_fail_percentage 11728 1726882180.84112: checking to see if all hosts have failed and the running result is not ok 11728 1726882180.84113: done checking to see if all hosts have failed 11728 1726882180.84114: getting the remaining hosts for this loop 11728 1726882180.84116: done getting the remaining hosts for this loop 11728 1726882180.84119: getting the next task for host managed_node3 11728 1726882180.84127: done getting next task for host managed_node3 11728 1726882180.84129: ^ task is: TASK: Show current_interfaces 11728 1726882180.84133: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882180.84138: getting variables 11728 1726882180.84139: in VariableManager get_vars() 11728 1726882180.84168: Calling all_inventory to load vars for managed_node3 11728 1726882180.84170: Calling groups_inventory to load vars for managed_node3 11728 1726882180.84174: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882180.84184: Calling all_plugins_play to load vars for managed_node3 11728 1726882180.84187: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882180.84190: Calling groups_plugins_play to load vars for managed_node3 11728 1726882180.84591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882180.84810: done with get_vars() 11728 1726882180.84822: done getting variables 11728 1726882180.84882: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:29:40 -0400 (0:00:00.036) 0:00:05.701 ****** 11728 1726882180.84913: entering _queue_task() for managed_node3/debug 11728 1726882180.85149: worker is 1 (out of 1 available) 11728 1726882180.85161: exiting _queue_task() for managed_node3/debug 11728 1726882180.85171: done queuing things up, now waiting for results queue to drain 11728 1726882180.85173: waiting for pending results... 11728 1726882180.85520: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 11728 1726882180.85525: in run() - task 12673a56-9f93-5c28-a762-0000000000b2 11728 1726882180.85535: variable 'ansible_search_path' from source: unknown 11728 1726882180.85542: variable 'ansible_search_path' from source: unknown 11728 1726882180.85577: calling self._execute() 11728 1726882180.85659: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.85671: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.85685: variable 'omit' from source: magic vars 11728 1726882180.86041: variable 'ansible_distribution_major_version' from source: facts 11728 1726882180.86065: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882180.86163: variable 'omit' from source: magic vars 11728 1726882180.86166: variable 'omit' from source: magic vars 11728 1726882180.86227: variable 'current_interfaces' from source: set_fact 11728 1726882180.86259: variable 'omit' from source: magic vars 11728 1726882180.86309: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882180.86349: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882180.86375: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882180.86401: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882180.86508: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882180.86799: 
variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882180.86802: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.86804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.86806: Set connection var ansible_connection to ssh 11728 1726882180.86808: Set connection var ansible_shell_executable to /bin/sh 11728 1726882180.86809: Set connection var ansible_timeout to 10 11728 1726882180.86811: Set connection var ansible_shell_type to sh 11728 1726882180.86813: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882180.86814: Set connection var ansible_pipelining to False 11728 1726882180.86816: variable 'ansible_shell_executable' from source: unknown 11728 1726882180.86818: variable 'ansible_connection' from source: unknown 11728 1726882180.86821: variable 'ansible_module_compression' from source: unknown 11728 1726882180.86822: variable 'ansible_shell_type' from source: unknown 11728 1726882180.86824: variable 'ansible_shell_executable' from source: unknown 11728 1726882180.86825: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.86827: variable 'ansible_pipelining' from source: unknown 11728 1726882180.86829: variable 'ansible_timeout' from source: unknown 11728 1726882180.86830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.87098: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882180.87245: variable 'omit' from source: magic vars 11728 1726882180.87254: starting attempt loop 11728 1726882180.87259: running the handler 11728 1726882180.87304: handler run complete 11728 1726882180.87361: attempt loop complete, returning result 11728 1726882180.87369: _execute() done 11728 1726882180.87376: dumping result to json 11728 1726882180.87383: done dumping result, returning 11728 1726882180.87411: done running TaskExecutor() for managed_node3/TASK: Show current_interfaces [12673a56-9f93-5c28-a762-0000000000b2] 11728 1726882180.87422: sending task result for task 12673a56-9f93-5c28-a762-0000000000b2 ok: [managed_node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 11728 1726882180.87609: no more pending results, returning what we have 11728 1726882180.87613: results queue empty 11728 1726882180.87614: checking for any_errors_fatal 11728 1726882180.87622: done checking for any_errors_fatal 11728 1726882180.87623: checking for max_fail_percentage 11728 1726882180.87625: done checking for max_fail_percentage 11728 1726882180.87626: checking to see if all hosts have failed and the running result is not ok 11728 1726882180.87626: done checking to see if all hosts have failed 11728 1726882180.87627: getting the remaining hosts for this loop 11728 1726882180.87629: done getting the remaining hosts for this loop 11728 1726882180.87633: getting the next task for host managed_node3 11728 1726882180.87641: done getting next task for host managed_node3 11728 1726882180.87645: ^ task is: TASK: Setup 11728 1726882180.87647: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882180.87652: getting variables 11728 1726882180.87654: in VariableManager get_vars() 11728 1726882180.87684: Calling all_inventory to load vars for managed_node3 11728 1726882180.87687: Calling groups_inventory to load vars for managed_node3 11728 1726882180.87691: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882180.88007: Calling all_plugins_play to load vars for managed_node3 11728 1726882180.88011: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882180.88015: Calling groups_plugins_play to load vars for managed_node3 11728 1726882180.88555: done sending task result for task 12673a56-9f93-5c28-a762-0000000000b2 11728 1726882180.88558: WORKER PROCESS EXITING 11728 1726882180.88580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882180.89038: done with get_vars() 11728 1726882180.89048: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 21:29:40 -0400 (0:00:00.042) 0:00:05.743 ****** 11728 1726882180.89137: entering _queue_task() for managed_node3/include_tasks 11728 1726882180.89943: worker is 1 (out of 1 available) 11728 1726882180.89955: exiting _queue_task() for managed_node3/include_tasks 11728 1726882180.89965: done queuing things up, now waiting for results queue to drain 11728 1726882180.89966: waiting for pending results... 
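The Show current_interfaces result above prints "current_interfaces: ['bonding_masters', 'eth0', 'lo']" from tasks/show_interfaces.yml:5, and the Setup task just queued (tasks/run_test.yml:24) loops over the files listed in lsr_setup. A hedged reconstruction of the debug task, inferred from the logged message format rather than from the file itself:

- name: Show current_interfaces
  ansible.builtin.debug:
    msg: "current_interfaces: {{ current_interfaces }}"

And a similarly hedged sketch of the Setup include, which the trace shows iterating over two items (tasks/create_test_interfaces_with_dhcp.yml and tasks/assert_dhcp_device_present.yml), with lsr_setup coming from include params:

- name: Setup
  ansible.builtin.include_tasks: "{{ item }}"
  loop: "{{ lsr_setup }}"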
11728 1726882180.90466: running TaskExecutor() for managed_node3/TASK: Setup 11728 1726882180.90610: in run() - task 12673a56-9f93-5c28-a762-00000000008b 11728 1726882180.90716: variable 'ansible_search_path' from source: unknown 11728 1726882180.90724: variable 'ansible_search_path' from source: unknown 11728 1726882180.90779: variable 'lsr_setup' from source: include params 11728 1726882180.90974: variable 'lsr_setup' from source: include params 11728 1726882180.91046: variable 'omit' from source: magic vars 11728 1726882180.91600: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.91604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.91607: variable 'omit' from source: magic vars 11728 1726882180.91834: variable 'ansible_distribution_major_version' from source: facts 11728 1726882180.91848: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882180.91858: variable 'item' from source: unknown 11728 1726882180.91929: variable 'item' from source: unknown 11728 1726882180.92199: variable 'item' from source: unknown 11728 1726882180.92203: variable 'item' from source: unknown 11728 1726882180.92370: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.92700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.92703: variable 'omit' from source: magic vars 11728 1726882180.92773: variable 'ansible_distribution_major_version' from source: facts 11728 1726882180.92784: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882180.92796: variable 'item' from source: unknown 11728 1726882180.92859: variable 'item' from source: unknown 11728 1726882180.92998: variable 'item' from source: unknown 11728 1726882180.93059: variable 'item' from source: unknown 11728 1726882180.93343: dumping result to json 11728 1726882180.93345: done dumping result, returning 11728 1726882180.93347: done running TaskExecutor() for managed_node3/TASK: Setup [12673a56-9f93-5c28-a762-00000000008b] 11728 1726882180.93349: sending task result for task 12673a56-9f93-5c28-a762-00000000008b 11728 1726882180.93379: done sending task result for task 12673a56-9f93-5c28-a762-00000000008b 11728 1726882180.93382: WORKER PROCESS EXITING 11728 1726882180.93425: no more pending results, returning what we have 11728 1726882180.93429: in VariableManager get_vars() 11728 1726882180.93457: Calling all_inventory to load vars for managed_node3 11728 1726882180.93460: Calling groups_inventory to load vars for managed_node3 11728 1726882180.93462: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882180.93471: Calling all_plugins_play to load vars for managed_node3 11728 1726882180.93473: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882180.93476: Calling groups_plugins_play to load vars for managed_node3 11728 1726882180.93840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882180.94231: done with get_vars() 11728 1726882180.94239: variable 'ansible_search_path' from source: unknown 11728 1726882180.94240: variable 'ansible_search_path' from source: unknown 11728 1726882180.94280: variable 'ansible_search_path' from source: unknown 11728 1726882180.94282: variable 'ansible_search_path' from source: unknown 11728 1726882180.94314: we have included files to process 11728 1726882180.94316: generating all_blocks data 11728 
1726882180.94317: done generating all_blocks data 11728 1726882180.94322: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 11728 1726882180.94323: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 11728 1726882180.94325: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 11728 1726882180.95831: done processing included file 11728 1726882180.95833: iterating over new_blocks loaded from include file 11728 1726882180.95835: in VariableManager get_vars() 11728 1726882180.95854: done with get_vars() 11728 1726882180.95856: filtering new block on tags 11728 1726882180.95930: done filtering new block on tags 11728 1726882180.95933: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml for managed_node3 => (item=tasks/create_test_interfaces_with_dhcp.yml) 11728 1726882180.95938: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 11728 1726882180.95939: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 11728 1726882180.95943: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 11728 1726882180.96077: in VariableManager get_vars() 11728 1726882180.96100: done with get_vars() 11728 1726882180.96107: variable 'item' from source: include params 11728 1726882180.96220: variable 'item' from source: include params 11728 1726882180.96256: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 11728 1726882180.96369: in VariableManager get_vars() 11728 1726882180.96388: done with get_vars() 11728 1726882180.96542: in VariableManager get_vars() 11728 1726882180.96559: done with get_vars() 11728 1726882180.96566: variable 'item' from source: include params 11728 1726882180.96633: variable 'item' from source: include params 11728 1726882180.96662: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 11728 1726882180.96742: in VariableManager get_vars() 11728 1726882180.96760: done with get_vars() 11728 1726882180.96878: done processing included file 11728 1726882180.96880: iterating over new_blocks loaded from include file 11728 1726882180.96882: in VariableManager get_vars() 11728 1726882180.96896: done with get_vars() 11728 1726882180.96898: filtering new block on tags 11728 1726882180.97080: done filtering new block on tags 11728 1726882180.97084: done iterating over new_blocks loaded from include file included: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml for managed_node3 => (item=tasks/assert_dhcp_device_present.yml) 11728 1726882180.97089: extending task lists for all hosts with included blocks 11728 1726882180.97942: done extending task lists 11728 1726882180.97944: done processing included files 11728 1726882180.97945: results queue empty 11728 1726882180.97946: checking for any_errors_fatal 11728 1726882180.97949: done checking for any_errors_fatal 11728 1726882180.97950: checking for max_fail_percentage 11728 1726882180.97951: done checking for max_fail_percentage 11728 1726882180.97952: checking to see if all hosts have failed and the running result is not ok 11728 1726882180.97952: done checking to see if all hosts have failed 11728 1726882180.97958: getting the remaining hosts for this loop 11728 1726882180.97960: done getting the remaining hosts for this loop 11728 1726882180.97962: getting the next task for host managed_node3 11728 1726882180.97966: done getting next task for host managed_node3 11728 1726882180.97968: ^ task is: TASK: Install dnsmasq 11728 1726882180.97970: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882180.97972: getting variables 11728 1726882180.97973: in VariableManager get_vars() 11728 1726882180.97981: Calling all_inventory to load vars for managed_node3 11728 1726882180.97983: Calling groups_inventory to load vars for managed_node3 11728 1726882180.97985: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882180.97991: Calling all_plugins_play to load vars for managed_node3 11728 1726882180.97997: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882180.98002: Calling groups_plugins_play to load vars for managed_node3 11728 1726882180.98151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882180.98345: done with get_vars() 11728 1726882180.98358: done getting variables 11728 1726882180.98398: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 21:29:40 -0400 (0:00:00.092) 0:00:05.836 ****** 11728 1726882180.98426: entering _queue_task() for managed_node3/package 11728 1726882180.98735: worker is 1 (out of 1 available) 11728 1726882180.98747: exiting _queue_task() for managed_node3/package 11728 1726882180.98758: done queuing things up, now waiting for results queue to drain 11728 1726882180.98759: waiting for pending results... 
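The Install dnsmasq task queued above lives at tasks/create_test_interfaces_with_dhcp.yml:3. The module invocation logged further down confirms it resolves to the dnf backend with name=["dnsmasq"] and state=present, and the final result reports "attempts": 1 together with a later check that __install_status is success, which points at a register/until retry loop. A hedged sketch under those assumptions; the retries and delay values are invented for illustration:

- name: Install dnsmasq
  ansible.builtin.package:
    name: dnsmasq
    state: present
  register: __install_status
  until: __install_status is success
  retries: 6   # assumed; the real value is not visible in this log
  delay: 10    # assumed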
11728 1726882180.99119: running TaskExecutor() for managed_node3/TASK: Install dnsmasq 11728 1726882180.99226: in run() - task 12673a56-9f93-5c28-a762-000000000112 11728 1726882180.99246: variable 'ansible_search_path' from source: unknown 11728 1726882180.99298: variable 'ansible_search_path' from source: unknown 11728 1726882180.99302: calling self._execute() 11728 1726882180.99409: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882180.99421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882180.99436: variable 'omit' from source: magic vars 11728 1726882180.99768: variable 'ansible_distribution_major_version' from source: facts 11728 1726882180.99785: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882180.99798: variable 'omit' from source: magic vars 11728 1726882180.99843: variable 'omit' from source: magic vars 11728 1726882181.00099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882181.02942: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882181.03023: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882181.03062: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882181.03103: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882181.03132: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882181.03227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882181.03258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882181.03287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882181.03336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882181.03354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882181.03462: variable '__network_is_ostree' from source: set_fact 11728 1726882181.03599: variable 'omit' from source: magic vars 11728 1726882181.03602: variable 'omit' from source: magic vars 11728 1726882181.03605: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882181.03607: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882181.03609: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882181.03611: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 11728 1726882181.03613: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882181.03645: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882181.03652: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882181.03659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882181.03754: Set connection var ansible_connection to ssh 11728 1726882181.03769: Set connection var ansible_shell_executable to /bin/sh 11728 1726882181.03778: Set connection var ansible_timeout to 10 11728 1726882181.03784: Set connection var ansible_shell_type to sh 11728 1726882181.03797: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882181.03807: Set connection var ansible_pipelining to False 11728 1726882181.03835: variable 'ansible_shell_executable' from source: unknown 11728 1726882181.03845: variable 'ansible_connection' from source: unknown 11728 1726882181.03852: variable 'ansible_module_compression' from source: unknown 11728 1726882181.03858: variable 'ansible_shell_type' from source: unknown 11728 1726882181.03863: variable 'ansible_shell_executable' from source: unknown 11728 1726882181.03870: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882181.03879: variable 'ansible_pipelining' from source: unknown 11728 1726882181.03885: variable 'ansible_timeout' from source: unknown 11728 1726882181.03892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882181.03987: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882181.04005: variable 'omit' from source: magic vars 11728 1726882181.04016: starting attempt loop 11728 1726882181.04024: running the handler 11728 1726882181.04034: variable 'ansible_facts' from source: unknown 11728 1726882181.04040: variable 'ansible_facts' from source: unknown 11728 1726882181.04092: _low_level_execute_command(): starting 11728 1726882181.04107: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882181.05242: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882181.05258: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882181.05274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882181.05290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882181.05513: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882181.05536: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882181.05613: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882181.07259: stdout chunk (state=3): >>>/root <<< 11728 1726882181.07403: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882181.07415: stdout chunk (state=3): >>><<< 11728 1726882181.07451: stderr chunk (state=3): >>><<< 11728 1726882181.07480: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882181.07567: _low_level_execute_command(): starting 11728 1726882181.07571: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882181.0752137-12076-175562466670358 `" && echo ansible-tmp-1726882181.0752137-12076-175562466670358="` echo /root/.ansible/tmp/ansible-tmp-1726882181.0752137-12076-175562466670358 `" ) && sleep 0' 11728 1726882181.08756: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882181.08801: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882181.08827: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882181.08900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882181.10761: stdout chunk (state=3): >>>ansible-tmp-1726882181.0752137-12076-175562466670358=/root/.ansible/tmp/ansible-tmp-1726882181.0752137-12076-175562466670358 <<< 11728 1726882181.10901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882181.10912: stdout chunk (state=3): >>><<< 11728 1726882181.10930: stderr chunk (state=3): >>><<< 11728 1726882181.10959: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882181.0752137-12076-175562466670358=/root/.ansible/tmp/ansible-tmp-1726882181.0752137-12076-175562466670358 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882181.10990: variable 'ansible_module_compression' from source: unknown 11728 1726882181.11054: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 11728 1726882181.11066: ANSIBALLZ: Acquiring lock 11728 1726882181.11072: ANSIBALLZ: Lock acquired: 139840770723472 11728 1726882181.11078: ANSIBALLZ: Creating module 11728 1726882181.24425: ANSIBALLZ: Writing module into payload 11728 1726882181.24558: ANSIBALLZ: Writing module 11728 1726882181.24574: ANSIBALLZ: Renaming module 11728 1726882181.24587: ANSIBALLZ: Done creating module 11728 1726882181.24604: variable 'ansible_facts' from source: unknown 11728 1726882181.24673: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882181.0752137-12076-175562466670358/AnsiballZ_dnf.py 11728 1726882181.24770: Sending initial data 11728 1726882181.24774: Sent initial data (152 bytes) 11728 1726882181.25204: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882181.25207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882181.25210: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882181.25213: stderr chunk (state=3): >>>debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882181.25215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882181.25270: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882181.25276: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882181.25279: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882181.25323: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882181.26863: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882181.26903: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882181.26951: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpscz93596 /root/.ansible/tmp/ansible-tmp-1726882181.0752137-12076-175562466670358/AnsiballZ_dnf.py <<< 11728 1726882181.26955: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882181.0752137-12076-175562466670358/AnsiballZ_dnf.py" <<< 11728 1726882181.26998: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpscz93596" to remote "/root/.ansible/tmp/ansible-tmp-1726882181.0752137-12076-175562466670358/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882181.0752137-12076-175562466670358/AnsiballZ_dnf.py" <<< 11728 1726882181.27671: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882181.27706: stderr chunk (state=3): >>><<< 11728 1726882181.27709: stdout chunk (state=3): >>><<< 11728 1726882181.27725: done transferring module to remote 11728 1726882181.27735: _low_level_execute_command(): starting 11728 1726882181.27741: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882181.0752137-12076-175562466670358/ /root/.ansible/tmp/ansible-tmp-1726882181.0752137-12076-175562466670358/AnsiballZ_dnf.py && sleep 0' 11728 1726882181.28140: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882181.28143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882181.28146: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882181.28148: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882181.28150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882181.28207: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882181.28209: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882181.28252: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882181.29947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882181.29966: stderr chunk (state=3): >>><<< 11728 1726882181.29969: stdout chunk (state=3): >>><<< 11728 1726882181.29981: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 
10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882181.29984: _low_level_execute_command(): starting 11728 1726882181.29988: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882181.0752137-12076-175562466670358/AnsiballZ_dnf.py && sleep 0' 11728 1726882181.30360: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882181.30389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882181.30399: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882181.30403: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882181.30405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882181.30506: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882181.30510: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882181.30598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882181.70212: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 11728 
1726882181.81069: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882181.81100: stderr chunk (state=3): >>><<< 11728 1726882181.81104: stdout chunk (state=3): >>><<< 11728 1726882181.81120: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
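The exchange above is the standard non-pipelined module execution path: Ansible creates a remote temp directory, transfers AnsiballZ_dnf.py over sftp, chmods it, runs it with /usr/bin/python3.12, and (in the entries that follow) removes the temp directory again. That path is used because the connection var ansible_pipelining was set to False earlier in the trace. For illustration only, not part of this test run, pipelining can be switched on with a single inventory or group variable; it typically also requires that sudoers does not enforce requiretty on the managed host:

# e.g. in group_vars/all.yml -- illustrative, not taken from the logged inventory
ansible_pipelining: true   # stream the module over the existing SSH session instead of sftp + tmpdir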
11728 1726882181.81152: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882181.0752137-12076-175562466670358/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882181.81158: _low_level_execute_command(): starting 11728 1726882181.81163: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882181.0752137-12076-175562466670358/ > /dev/null 2>&1 && sleep 0' 11728 1726882181.81579: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882181.81582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882181.81616: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882181.81619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11728 1726882181.81622: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 11728 1726882181.81625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882181.81678: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882181.81683: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882181.81686: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882181.81731: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882181.83519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882181.83541: stderr chunk (state=3): >>><<< 11728 1726882181.83544: stdout chunk (state=3): >>><<< 11728 1726882181.83556: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882181.83564: handler run complete 11728 1726882181.83677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882181.83802: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882181.83828: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882181.83850: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882181.83872: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882181.83928: variable '__install_status' from source: unknown 11728 1726882181.83942: Evaluated conditional (__install_status is success): True 11728 1726882181.83954: attempt loop complete, returning result 11728 1726882181.83956: _execute() done 11728 1726882181.83959: dumping result to json 11728 1726882181.83963: done dumping result, returning 11728 1726882181.83971: done running TaskExecutor() for managed_node3/TASK: Install dnsmasq [12673a56-9f93-5c28-a762-000000000112] 11728 1726882181.83975: sending task result for task 12673a56-9f93-5c28-a762-000000000112 11728 1726882181.84113: done sending task result for task 12673a56-9f93-5c28-a762-000000000112 11728 1726882181.84116: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 11728 1726882181.84191: no more pending results, returning what we have 11728 1726882181.84199: results queue empty 11728 1726882181.84200: checking for any_errors_fatal 11728 1726882181.84201: done checking for any_errors_fatal 11728 1726882181.84202: checking for max_fail_percentage 11728 1726882181.84203: done checking for max_fail_percentage 11728 1726882181.84204: checking to see if all hosts have failed and the running result is not ok 11728 1726882181.84204: done checking to see if all hosts have failed 11728 1726882181.84205: getting the remaining hosts for this loop 11728 1726882181.84207: done getting the remaining hosts for this loop 11728 1726882181.84210: getting the next task for host managed_node3 11728 1726882181.84215: done getting next task for host managed_node3 11728 1726882181.84217: ^ task is: TASK: Install pgrep, sysctl 11728 1726882181.84220: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882181.84223: getting variables 11728 1726882181.84225: in VariableManager get_vars() 11728 1726882181.84249: Calling all_inventory to load vars for managed_node3 11728 1726882181.84252: Calling groups_inventory to load vars for managed_node3 11728 1726882181.84255: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882181.84264: Calling all_plugins_play to load vars for managed_node3 11728 1726882181.84266: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882181.84269: Calling groups_plugins_play to load vars for managed_node3 11728 1726882181.84421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882181.84540: done with get_vars() 11728 1726882181.84548: done getting variables 11728 1726882181.84587: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Friday 20 September 2024 21:29:41 -0400 (0:00:00.861) 0:00:06.698 ****** 11728 1726882181.84611: entering _queue_task() for managed_node3/package 11728 1726882181.84809: worker is 1 (out of 1 available) 11728 1726882181.84822: exiting _queue_task() for managed_node3/package 11728 1726882181.84833: done queuing things up, now waiting for results queue to drain 11728 1726882181.84834: waiting for pending results... 
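The task just queued is the first of two "Install pgrep, sysctl" tasks (tasks/create_test_interfaces_with_dhcp.yml:17). The skip result that follows shows it is gated on ansible_os_family == 'RedHat' and ansible_distribution_major_version is version('6', '<='), i.e. the EL6-only branch. A hedged reconstruction; the package name is illustrative (pgrep and sysctl ship in procps on EL6), since the task file itself is not shown here:

- name: Install pgrep, sysctl
  ansible.builtin.package:
    name: procps   # assumed package name for the EL6 branch
    state: present
  when:
    - ansible_os_family == 'RedHat'
    - ansible_distribution_major_version is version('6', '<=')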
11728 1726882181.84974: running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl 11728 1726882181.85032: in run() - task 12673a56-9f93-5c28-a762-000000000113 11728 1726882181.85043: variable 'ansible_search_path' from source: unknown 11728 1726882181.85046: variable 'ansible_search_path' from source: unknown 11728 1726882181.85076: calling self._execute() 11728 1726882181.85135: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882181.85138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882181.85146: variable 'omit' from source: magic vars 11728 1726882181.85399: variable 'ansible_distribution_major_version' from source: facts 11728 1726882181.85407: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882181.85481: variable 'ansible_os_family' from source: facts 11728 1726882181.85485: Evaluated conditional (ansible_os_family == 'RedHat'): True 11728 1726882181.85600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882181.85813: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882181.85846: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882181.85869: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882181.85892: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882181.85949: variable 'ansible_distribution_major_version' from source: facts 11728 1726882181.85958: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 11728 1726882181.85961: when evaluation is False, skipping this task 11728 1726882181.85964: _execute() done 11728 1726882181.85966: dumping result to json 11728 1726882181.85968: done dumping result, returning 11728 1726882181.85975: done running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl [12673a56-9f93-5c28-a762-000000000113] 11728 1726882181.85979: sending task result for task 12673a56-9f93-5c28-a762-000000000113 11728 1726882181.86061: done sending task result for task 12673a56-9f93-5c28-a762-000000000113 11728 1726882181.86064: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 11728 1726882181.86107: no more pending results, returning what we have 11728 1726882181.86111: results queue empty 11728 1726882181.86112: checking for any_errors_fatal 11728 1726882181.86117: done checking for any_errors_fatal 11728 1726882181.86117: checking for max_fail_percentage 11728 1726882181.86119: done checking for max_fail_percentage 11728 1726882181.86119: checking to see if all hosts have failed and the running result is not ok 11728 1726882181.86120: done checking to see if all hosts have failed 11728 1726882181.86121: getting the remaining hosts for this loop 11728 1726882181.86122: done getting the remaining hosts for this loop 11728 1726882181.86125: getting the next task for host managed_node3 11728 1726882181.86131: done getting next task for host managed_node3 11728 1726882181.86133: ^ task is: TASK: Install pgrep, sysctl 11728 1726882181.86136: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882181.86139: getting variables 11728 1726882181.86140: in VariableManager get_vars() 11728 1726882181.86161: Calling all_inventory to load vars for managed_node3 11728 1726882181.86163: Calling groups_inventory to load vars for managed_node3 11728 1726882181.86166: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882181.86173: Calling all_plugins_play to load vars for managed_node3 11728 1726882181.86176: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882181.86178: Calling groups_plugins_play to load vars for managed_node3 11728 1726882181.86290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882181.86428: done with get_vars() 11728 1726882181.86436: done getting variables 11728 1726882181.86472: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Friday 20 September 2024 21:29:41 -0400 (0:00:00.018) 0:00:06.717 ****** 11728 1726882181.86492: entering _queue_task() for managed_node3/package 11728 1726882181.86665: worker is 1 (out of 1 available) 11728 1726882181.86681: exiting _queue_task() for managed_node3/package 11728 1726882181.86691: done queuing things up, now waiting for results queue to drain 11728 1726882181.86696: waiting for pending results... 
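
The two back-to-back "Install pgrep, sysctl" tasks are mutually exclusive variants keyed on the distribution major version: the one above was skipped because ansible_distribution_major_version is version('6', '<=') evaluated False, while the one below runs under version('7', '>=') and installs procps-ng. A minimal sketch of the variant that runs, assuming this is roughly how the conditions are spelled in the test playbook:

- name: Install pgrep, sysctl
  ansible.builtin.package:
    name: procps-ng    # provides pgrep and sysctl on EL7+/Fedora
    state: present
  when:
    - ansible_os_family == 'RedHat'
    - ansible_distribution_major_version is version('7', '>=')
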
11728 1726882181.86830: running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl 11728 1726882181.86882: in run() - task 12673a56-9f93-5c28-a762-000000000114 11728 1726882181.86891: variable 'ansible_search_path' from source: unknown 11728 1726882181.86899: variable 'ansible_search_path' from source: unknown 11728 1726882181.86925: calling self._execute() 11728 1726882181.86977: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882181.86981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882181.86989: variable 'omit' from source: magic vars 11728 1726882181.87229: variable 'ansible_distribution_major_version' from source: facts 11728 1726882181.87237: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882181.87315: variable 'ansible_os_family' from source: facts 11728 1726882181.87318: Evaluated conditional (ansible_os_family == 'RedHat'): True 11728 1726882181.87432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882181.87618: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882181.87648: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882181.87671: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882181.87699: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882181.87768: variable 'ansible_distribution_major_version' from source: facts 11728 1726882181.87778: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 11728 1726882181.87783: variable 'omit' from source: magic vars 11728 1726882181.87818: variable 'omit' from source: magic vars 11728 1726882181.87918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882181.89600: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882181.89604: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882181.89613: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882181.89650: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882181.89680: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882181.89769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882181.89803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882181.89834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882181.89877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882181.89896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882181.89988: variable '__network_is_ostree' from source: set_fact 11728 1726882181.90005: variable 'omit' from source: magic vars 11728 1726882181.90038: variable 'omit' from source: magic vars 11728 1726882181.90068: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882181.90101: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882181.90124: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882181.90144: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882181.90198: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882181.90202: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882181.90206: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882181.90208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882181.90305: Set connection var ansible_connection to ssh 11728 1726882181.90323: Set connection var ansible_shell_executable to /bin/sh 11728 1726882181.90338: Set connection var ansible_timeout to 10 11728 1726882181.90340: Set connection var ansible_shell_type to sh 11728 1726882181.90348: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882181.90352: Set connection var ansible_pipelining to False 11728 1726882181.90382: variable 'ansible_shell_executable' from source: unknown 11728 1726882181.90386: variable 'ansible_connection' from source: unknown 11728 1726882181.90389: variable 'ansible_module_compression' from source: unknown 11728 1726882181.90391: variable 'ansible_shell_type' from source: unknown 11728 1726882181.90398: variable 'ansible_shell_executable' from source: unknown 11728 1726882181.90401: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882181.90403: variable 'ansible_pipelining' from source: unknown 11728 1726882181.90412: variable 'ansible_timeout' from source: unknown 11728 1726882181.90414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882181.90483: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882181.90491: variable 'omit' from source: magic vars 11728 1726882181.90498: starting attempt loop 11728 1726882181.90501: running the handler 11728 1726882181.90507: variable 'ansible_facts' from source: unknown 11728 1726882181.90509: variable 'ansible_facts' from source: unknown 11728 1726882181.90549: _low_level_execute_command(): starting 11728 1726882181.90555: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882181.91031: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882181.91034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882181.91037: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 11728 1726882181.91039: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882181.91049: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882181.91099: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882181.91105: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882181.91151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882181.92721: stdout chunk (state=3): >>>/root <<< 11728 1726882181.92869: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882181.92873: stdout chunk (state=3): >>><<< 11728 1726882181.92875: stderr chunk (state=3): >>><<< 11728 1726882181.92896: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882181.92915: _low_level_execute_command(): starting 11728 1726882181.92998: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882181.929035-12129-76546333922922 `" && echo ansible-tmp-1726882181.929035-12129-76546333922922="` echo /root/.ansible/tmp/ansible-tmp-1726882181.929035-12129-76546333922922 `" ) && sleep 0' 11728 1726882181.93560: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 
1726882181.93574: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882181.93591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882181.93613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882181.93663: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882181.93728: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882181.93763: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882181.93836: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882181.95665: stdout chunk (state=3): >>>ansible-tmp-1726882181.929035-12129-76546333922922=/root/.ansible/tmp/ansible-tmp-1726882181.929035-12129-76546333922922 <<< 11728 1726882181.95835: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882181.95839: stdout chunk (state=3): >>><<< 11728 1726882181.95841: stderr chunk (state=3): >>><<< 11728 1726882181.95900: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882181.929035-12129-76546333922922=/root/.ansible/tmp/ansible-tmp-1726882181.929035-12129-76546333922922 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882181.95903: variable 'ansible_module_compression' from source: unknown 11728 1726882181.95997: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 11728 1726882181.96325: variable 'ansible_facts' from source: unknown 11728 1726882181.96328: 
transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882181.929035-12129-76546333922922/AnsiballZ_dnf.py 11728 1726882181.96454: Sending initial data 11728 1726882181.96458: Sent initial data (150 bytes) 11728 1726882181.97408: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882181.97451: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882181.97469: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882181.97489: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882181.97564: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882181.99085: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882181.99125: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882181.99165: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpy8u2k_vg /root/.ansible/tmp/ansible-tmp-1726882181.929035-12129-76546333922922/AnsiballZ_dnf.py <<< 11728 1726882181.99184: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882181.929035-12129-76546333922922/AnsiballZ_dnf.py" <<< 11728 1726882181.99232: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpy8u2k_vg" to remote "/root/.ansible/tmp/ansible-tmp-1726882181.929035-12129-76546333922922/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882181.929035-12129-76546333922922/AnsiballZ_dnf.py" <<< 11728 1726882182.00329: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882182.00358: stderr chunk (state=3): >>><<< 11728 1726882182.00370: stdout chunk (state=3): >>><<< 11728 1726882182.00410: done transferring module to remote 11728 1726882182.00425: _low_level_execute_command(): starting 11728 1726882182.00433: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882181.929035-12129-76546333922922/ /root/.ansible/tmp/ansible-tmp-1726882181.929035-12129-76546333922922/AnsiballZ_dnf.py && sleep 0' 11728 1726882182.01075: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882182.01088: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882182.01108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882182.01126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882182.01162: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882182.01178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882182.01269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882182.01282: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882182.01352: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882182.03110: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882182.03134: stderr chunk (state=3): >>><<< 11728 1726882182.03143: stdout chunk (state=3): >>><<< 11728 1726882182.03166: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882182.03175: _low_level_execute_command(): starting 11728 1726882182.03187: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882181.929035-12129-76546333922922/AnsiballZ_dnf.py && sleep 0' 11728 1726882182.03860: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882182.03881: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882182.03906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882182.03925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882182.03943: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882182.04002: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882182.04066: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882182.04090: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882182.04113: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882182.04224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882182.43410: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", 
"best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 11728 1726882182.47361: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882182.47387: stderr chunk (state=3): >>><<< 11728 1726882182.47390: stdout chunk (state=3): >>><<< 11728 1726882182.47413: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
11728 1726882182.47445: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882181.929035-12129-76546333922922/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882182.47451: _low_level_execute_command(): starting 11728 1726882182.47456: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882181.929035-12129-76546333922922/ > /dev/null 2>&1 && sleep 0' 11728 1726882182.47914: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882182.47917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882182.47919: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882182.47921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882182.47923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882182.47974: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882182.47977: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882182.47983: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882182.48031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882182.49818: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882182.49843: stderr chunk (state=3): >>><<< 11728 1726882182.49846: stdout chunk (state=3): >>><<< 11728 1726882182.49865: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882182.49868: handler run complete 11728 1726882182.49895: attempt loop complete, returning result 11728 1726882182.49899: _execute() done 11728 1726882182.49901: dumping result to json 11728 1726882182.49908: done dumping result, returning 11728 1726882182.49915: done running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl [12673a56-9f93-5c28-a762-000000000114] 11728 1726882182.49919: sending task result for task 12673a56-9f93-5c28-a762-000000000114 11728 1726882182.50012: done sending task result for task 12673a56-9f93-5c28-a762-000000000114 11728 1726882182.50015: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 11728 1726882182.50082: no more pending results, returning what we have 11728 1726882182.50085: results queue empty 11728 1726882182.50086: checking for any_errors_fatal 11728 1726882182.50091: done checking for any_errors_fatal 11728 1726882182.50091: checking for max_fail_percentage 11728 1726882182.50095: done checking for max_fail_percentage 11728 1726882182.50096: checking to see if all hosts have failed and the running result is not ok 11728 1726882182.50100: done checking to see if all hosts have failed 11728 1726882182.50101: getting the remaining hosts for this loop 11728 1726882182.50103: done getting the remaining hosts for this loop 11728 1726882182.50107: getting the next task for host managed_node3 11728 1726882182.50113: done getting next task for host managed_node3 11728 1726882182.50115: ^ task is: TASK: Create test interfaces 11728 1726882182.50118: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882182.50121: getting variables 11728 1726882182.50123: in VariableManager get_vars() 11728 1726882182.50154: Calling all_inventory to load vars for managed_node3 11728 1726882182.50156: Calling groups_inventory to load vars for managed_node3 11728 1726882182.50160: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882182.50169: Calling all_plugins_play to load vars for managed_node3 11728 1726882182.50172: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882182.50175: Calling groups_plugins_play to load vars for managed_node3 11728 1726882182.50324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882182.50441: done with get_vars() 11728 1726882182.50449: done getting variables 11728 1726882182.50517: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Create test interfaces] ************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Friday 20 September 2024 21:29:42 -0400 (0:00:00.640) 0:00:07.357 ****** 11728 1726882182.50540: entering _queue_task() for managed_node3/shell 11728 1726882182.50545: Creating lock for shell 11728 1726882182.50744: worker is 1 (out of 1 available) 11728 1726882182.50758: exiting _queue_task() for managed_node3/shell 11728 1726882182.50770: done queuing things up, now waiting for results queue to drain 11728 1726882182.50771: waiting for pending results... 
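
The "Create test interfaces" shell task queued here produces the "+ ip link add test1 type veth peer name test1p ..." trace visible in its output further below; dhcp_interface1 and dhcp_interface2 are play vars (test1 and test2 in this run). A heavily abbreviated sketch of such a task, assuming the overall shape; the real script in create_test_interfaces_with_dhcp.yml also handles NetworkManager device management, dnsmasq setup, and error checking, as the trace shows:

- name: Create test interfaces
  ansible.builtin.shell: |
    set -x   # assumed; the traced commands are echoed with a '+' prefix
    ip link add {{ dhcp_interface1 }} type veth peer name {{ dhcp_interface1 }}p
    ip link add {{ dhcp_interface2 }} type veth peer name {{ dhcp_interface2 }}p
    ip link add name testbr type bridge forward_delay 0
    ip link set testbr up
    ip addr add 192.0.2.1/24 dev testbr
    ip -6 addr add 2001:DB8::1/32 dev testbr
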
11728 1726882182.50924: running TaskExecutor() for managed_node3/TASK: Create test interfaces 11728 1726882182.50984: in run() - task 12673a56-9f93-5c28-a762-000000000115 11728 1726882182.51009: variable 'ansible_search_path' from source: unknown 11728 1726882182.51012: variable 'ansible_search_path' from source: unknown 11728 1726882182.51032: calling self._execute() 11728 1726882182.51087: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882182.51091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882182.51104: variable 'omit' from source: magic vars 11728 1726882182.51363: variable 'ansible_distribution_major_version' from source: facts 11728 1726882182.51372: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882182.51378: variable 'omit' from source: magic vars 11728 1726882182.51410: variable 'omit' from source: magic vars 11728 1726882182.51690: variable 'dhcp_interface1' from source: play vars 11728 1726882182.51695: variable 'dhcp_interface2' from source: play vars 11728 1726882182.51713: variable 'omit' from source: magic vars 11728 1726882182.51744: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882182.51772: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882182.51788: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882182.51806: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882182.51815: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882182.51837: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882182.51841: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882182.51843: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882182.51912: Set connection var ansible_connection to ssh 11728 1726882182.51921: Set connection var ansible_shell_executable to /bin/sh 11728 1726882182.51926: Set connection var ansible_timeout to 10 11728 1726882182.51928: Set connection var ansible_shell_type to sh 11728 1726882182.51935: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882182.51940: Set connection var ansible_pipelining to False 11728 1726882182.51957: variable 'ansible_shell_executable' from source: unknown 11728 1726882182.51960: variable 'ansible_connection' from source: unknown 11728 1726882182.51963: variable 'ansible_module_compression' from source: unknown 11728 1726882182.51965: variable 'ansible_shell_type' from source: unknown 11728 1726882182.51967: variable 'ansible_shell_executable' from source: unknown 11728 1726882182.51969: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882182.51973: variable 'ansible_pipelining' from source: unknown 11728 1726882182.51976: variable 'ansible_timeout' from source: unknown 11728 1726882182.51980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882182.52080: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882182.52088: variable 'omit' from source: magic vars 11728 1726882182.52092: starting attempt loop 11728 1726882182.52096: running the handler 11728 1726882182.52108: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882182.52123: _low_level_execute_command(): starting 11728 1726882182.52130: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882182.52632: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882182.52636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882182.52639: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882182.52644: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882182.52699: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882182.52702: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882182.52707: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882182.52752: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882182.54304: stdout chunk (state=3): >>>/root <<< 11728 1726882182.54404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882182.54434: stderr chunk (state=3): >>><<< 11728 1726882182.54437: stdout chunk (state=3): >>><<< 11728 1726882182.54455: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 
10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882182.54467: _low_level_execute_command(): starting 11728 1726882182.54472: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882182.5445511-12160-144520351575334 `" && echo ansible-tmp-1726882182.5445511-12160-144520351575334="` echo /root/.ansible/tmp/ansible-tmp-1726882182.5445511-12160-144520351575334 `" ) && sleep 0' 11728 1726882182.54931: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882182.54942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882182.54945: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882182.54947: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882182.54949: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882182.54995: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882182.55000: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882182.55005: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882182.55052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882182.56874: stdout chunk (state=3): >>>ansible-tmp-1726882182.5445511-12160-144520351575334=/root/.ansible/tmp/ansible-tmp-1726882182.5445511-12160-144520351575334 <<< 11728 1726882182.56974: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882182.56999: stderr chunk (state=3): >>><<< 11728 1726882182.57004: stdout chunk (state=3): >>><<< 11728 1726882182.57019: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882182.5445511-12160-144520351575334=/root/.ansible/tmp/ansible-tmp-1726882182.5445511-12160-144520351575334 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882182.57042: variable 'ansible_module_compression' from source: unknown 11728 1726882182.57082: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882182.57107: variable 'ansible_facts' from source: unknown 11728 1726882182.57158: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882182.5445511-12160-144520351575334/AnsiballZ_command.py 11728 1726882182.57255: Sending initial data 11728 1726882182.57258: Sent initial data (156 bytes) 11728 1726882182.57674: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882182.57677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882182.57680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882182.57682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882182.57734: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882182.57738: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882182.57787: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882182.59299: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 
1726882182.59354: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882182.59405: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpu5iydspc /root/.ansible/tmp/ansible-tmp-1726882182.5445511-12160-144520351575334/AnsiballZ_command.py <<< 11728 1726882182.59408: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882182.5445511-12160-144520351575334/AnsiballZ_command.py" <<< 11728 1726882182.59445: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpu5iydspc" to remote "/root/.ansible/tmp/ansible-tmp-1726882182.5445511-12160-144520351575334/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882182.5445511-12160-144520351575334/AnsiballZ_command.py" <<< 11728 1726882182.60298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882182.60302: stdout chunk (state=3): >>><<< 11728 1726882182.60304: stderr chunk (state=3): >>><<< 11728 1726882182.60306: done transferring module to remote 11728 1726882182.60308: _low_level_execute_command(): starting 11728 1726882182.60310: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882182.5445511-12160-144520351575334/ /root/.ansible/tmp/ansible-tmp-1726882182.5445511-12160-144520351575334/AnsiballZ_command.py && sleep 0' 11728 1726882182.60895: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882182.60981: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882182.61021: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882182.61036: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882182.61054: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882182.61133: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882182.62862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882182.62881: stdout chunk (state=3): >>><<< 11728 1726882182.62905: stderr chunk (state=3): >>><<< 11728 1726882182.62926: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882182.62937: _low_level_execute_command(): starting 11728 1726882182.63018: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882182.5445511-12160-144520351575334/AnsiballZ_command.py && sleep 0' 11728 1726882182.63574: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882182.63589: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882182.63613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882182.63633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882182.63650: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882182.63710: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882182.63723: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882182.63776: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882184.00010: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 711 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 711 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ 
grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 21:29:42.784707", "end": "2024-09-20 21:29:43.997712", "delta": "0:00:01.213005", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882184.01647: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882184.01676: stderr chunk (state=3): >>><<< 11728 1726882184.01679: stdout chunk (state=3): >>><<< 11728 1726882184.01710: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 711 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 711 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! 
rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 21:29:42.784707", "end": "2024-09-20 21:29:43.997712", "delta": "0:00:01.213005", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
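For readability, the shell payload whose result was just returned above (the command task reported a few entries below as "Create test interfaces") is reproduced here as a standalone script. It is condensed from the _raw_params value in the log and lightly restructured: only the non-RHEL 6 branch that the '+' trace in the task's stderr shows was taken is kept, and the RHEL 6 radvd/bridge-utils branch, the firewalld service checks, and the NOTICE echoes inside the retry loop are omitted.

    #!/bin/bash
    # Condensed from the _raw_params of the command task logged above.
    set -euxo pipefail
    exec 1>&2                      # the task routes all output to stderr

    # Two veth pairs: test1/test1p and test2/test2p.
    ip link add test1 type veth peer name test1p
    ip link add test2 type veth peer name test2p
    if [ -n "$(pgrep NetworkManager)" ]; then
        nmcli d set test1 managed true
        nmcli d set test2 managed true
        # NetworkManager must not manage the DHCP-server ends of the pairs.
        nmcli d set test1p managed false
        nmcli d set test2p managed false
    fi
    ip link set test1p up
    ip link set test2p up

    # Bridge that carries the test DHCPv4/DHCPv6 service.
    ip link add name testbr type bridge forward_delay 0
    if [ -n "$(pgrep NetworkManager)" ]; then
        nmcli d set testbr managed false
    fi
    ip link set testbr up

    # Retry loop working around https://bugzilla.redhat.com/show_bug.cgi?id=2079642
    timer=0
    while ! ip addr show testbr | grep -q 'inet [1-9]'; do
        timer=$((timer + 1))
        if [ "$timer" -eq 30 ]; then
            echo "ERROR - could not add testbr"
            ip addr
            exit 1
        fi
        sleep 1
        ip addr add 192.0.2.1/24 dev testbr || continue
        ip -6 addr add 2001:DB8::1/32 dev testbr || continue
    done

    # Attach the peer ends and start a joint DHCP4/DHCP6 server with RA enabled.
    ip link set test1p master testbr
    ip link set test2p master testbr
    dnsmasq --pid-file=/run/dhcp_testbr.pid \
            --dhcp-leasefile=/run/dhcp_testbr.lease \
            --dhcp-range=192.0.2.1,192.0.2.254,240 \
            --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 \
            --enable-ra --interface=testbr --bind-interfaces

The stderr trace above confirms this path: NetworkManager was running (pid 711), /etc/redhat-release did not match 'release 6', firewalld was inactive, and dnsmasq was started with both the 192.0.2.0/24 and 2001:DB8::/32 ranges.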
11728 1726882184.01747: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882182.5445511-12160-144520351575334/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882184.01754: _low_level_execute_command(): starting 11728 1726882184.01759: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882182.5445511-12160-144520351575334/ > /dev/null 2>&1 && sleep 0' 11728 1726882184.02218: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882184.02223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882184.02225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882184.02229: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882184.02231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882184.02282: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882184.02288: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882184.02291: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882184.02334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882184.04187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882184.04217: stderr chunk (state=3): >>><<< 11728 1726882184.04220: stdout chunk (state=3): >>><<< 11728 1726882184.04234: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882184.04240: handler run complete 11728 1726882184.04257: Evaluated conditional (False): False 11728 1726882184.04266: attempt loop complete, returning result 11728 1726882184.04268: _execute() done 11728 1726882184.04271: dumping result to json 11728 1726882184.04276: done dumping result, returning 11728 1726882184.04287: done running TaskExecutor() for managed_node3/TASK: Create test interfaces [12673a56-9f93-5c28-a762-000000000115] 11728 1726882184.04290: sending task result for task 12673a56-9f93-5c28-a762-000000000115 11728 1726882184.04397: done sending task result for task 12673a56-9f93-5c28-a762-000000000115 11728 1726882184.04400: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.213005", "end": "2024-09-20 21:29:43.997712", "rc": 0, "start": "2024-09-20 21:29:42.784707" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 711 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 711 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + ip addr show testbr + grep -q 'inet [1-9]' + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 11728 1726882184.04478: no more pending results, returning what we have 11728 1726882184.04482: results queue empty 11728 1726882184.04483: checking for any_errors_fatal 11728 1726882184.04490: done checking for any_errors_fatal 11728 1726882184.04490: checking for max_fail_percentage 11728 1726882184.04492: done checking for max_fail_percentage 11728 1726882184.04497: checking to see if all hosts have failed and the running result is not ok 11728 1726882184.04497: done checking to see if all hosts have failed 11728 1726882184.04498: getting the remaining hosts for this loop 11728 1726882184.04500: done getting the remaining hosts for this loop 11728 1726882184.04504: getting the next task for host managed_node3 11728 1726882184.04513: done getting next task for host managed_node3 11728 1726882184.04517: ^ task is: TASK: 
Include the task 'get_interface_stat.yml' 11728 1726882184.04521: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882184.04524: getting variables 11728 1726882184.04525: in VariableManager get_vars() 11728 1726882184.04550: Calling all_inventory to load vars for managed_node3 11728 1726882184.04552: Calling groups_inventory to load vars for managed_node3 11728 1726882184.04555: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882184.04564: Calling all_plugins_play to load vars for managed_node3 11728 1726882184.04566: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882184.04568: Calling groups_plugins_play to load vars for managed_node3 11728 1726882184.04733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882184.04852: done with get_vars() 11728 1726882184.04859: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:29:44 -0400 (0:00:01.543) 0:00:08.901 ****** 11728 1726882184.04931: entering _queue_task() for managed_node3/include_tasks 11728 1726882184.05127: worker is 1 (out of 1 available) 11728 1726882184.05140: exiting _queue_task() for managed_node3/include_tasks 11728 1726882184.05151: done queuing things up, now waiting for results queue to drain 11728 1726882184.05152: waiting for pending results... 
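The include queued here resolves to get_interface_stat.yml, and the stat task it pulls in (logged below as "Get stat for interface test1") goes through the same remote-execution sequence as the command task above: reuse the multiplexed SSH master at /root/.ansible/cp/537759ca41, create a per-task temporary directory, upload the AnsiballZ payload over SFTP, make it executable, run it with the remote Python, and finally remove the directory. A sketch of that sequence, assembled from the commands this log records, follows; REMOTE_TMP and MODULE are illustrative placeholders for the timestamped ansible-tmp-* directory and AnsiballZ_*.py file that each task generates.

    # Placeholders (illustrative names only); the real values are logged per task,
    # e.g. /root/.ansible/tmp/ansible-tmp-1726882184.1072662-12212-63322708455295.
    REMOTE_TMP=/root/.ansible/tmp/ansible-tmp-EXAMPLE
    MODULE="$REMOTE_TMP/AnsiballZ_stat.py"

    /bin/sh -c 'echo ~ && sleep 0'                                  # discover the remote home directory
    /bin/sh -c "( umask 77 && mkdir -p /root/.ansible/tmp && mkdir $REMOTE_TMP ) && sleep 0"
    # sftp> put <local tmp file> "$MODULE"                          # payload transferred over SFTP
    /bin/sh -c "chmod u+x $REMOTE_TMP/ $MODULE && sleep 0"          # make directory and payload executable
    /bin/sh -c "/usr/bin/python3.12 $MODULE && sleep 0"             # execute; module result is the JSON on stdout
    /bin/sh -c "rm -f -r $REMOTE_TMP/ > /dev/null 2>&1 && sleep 0"  # remove the per-task directory

Each step corresponds to a _low_level_execute_command() or SFTP entry around this point in the log; the mkdir step in the log additionally echoes the generated directory name back so the controller learns the path it will use for the rest of the task.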
11728 1726882184.05304: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 11728 1726882184.05369: in run() - task 12673a56-9f93-5c28-a762-00000000011c 11728 1726882184.05385: variable 'ansible_search_path' from source: unknown 11728 1726882184.05388: variable 'ansible_search_path' from source: unknown 11728 1726882184.05415: calling self._execute() 11728 1726882184.05471: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882184.05474: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882184.05489: variable 'omit' from source: magic vars 11728 1726882184.05739: variable 'ansible_distribution_major_version' from source: facts 11728 1726882184.05748: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882184.05754: _execute() done 11728 1726882184.05757: dumping result to json 11728 1726882184.05760: done dumping result, returning 11728 1726882184.05766: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-5c28-a762-00000000011c] 11728 1726882184.05771: sending task result for task 12673a56-9f93-5c28-a762-00000000011c 11728 1726882184.05858: done sending task result for task 12673a56-9f93-5c28-a762-00000000011c 11728 1726882184.05860: WORKER PROCESS EXITING 11728 1726882184.05884: no more pending results, returning what we have 11728 1726882184.05889: in VariableManager get_vars() 11728 1726882184.05921: Calling all_inventory to load vars for managed_node3 11728 1726882184.05924: Calling groups_inventory to load vars for managed_node3 11728 1726882184.05926: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882184.05935: Calling all_plugins_play to load vars for managed_node3 11728 1726882184.05938: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882184.05940: Calling groups_plugins_play to load vars for managed_node3 11728 1726882184.06063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882184.06175: done with get_vars() 11728 1726882184.06181: variable 'ansible_search_path' from source: unknown 11728 1726882184.06181: variable 'ansible_search_path' from source: unknown 11728 1726882184.06211: we have included files to process 11728 1726882184.06211: generating all_blocks data 11728 1726882184.06213: done generating all_blocks data 11728 1726882184.06217: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11728 1726882184.06218: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11728 1726882184.06219: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11728 1726882184.06370: done processing included file 11728 1726882184.06372: iterating over new_blocks loaded from include file 11728 1726882184.06373: in VariableManager get_vars() 11728 1726882184.06383: done with get_vars() 11728 1726882184.06384: filtering new block on tags 11728 1726882184.06405: done filtering new block on tags 11728 1726882184.06407: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 11728 
1726882184.06411: extending task lists for all hosts with included blocks 11728 1726882184.06558: done extending task lists 11728 1726882184.06559: done processing included files 11728 1726882184.06559: results queue empty 11728 1726882184.06560: checking for any_errors_fatal 11728 1726882184.06563: done checking for any_errors_fatal 11728 1726882184.06563: checking for max_fail_percentage 11728 1726882184.06564: done checking for max_fail_percentage 11728 1726882184.06565: checking to see if all hosts have failed and the running result is not ok 11728 1726882184.06565: done checking to see if all hosts have failed 11728 1726882184.06566: getting the remaining hosts for this loop 11728 1726882184.06567: done getting the remaining hosts for this loop 11728 1726882184.06568: getting the next task for host managed_node3 11728 1726882184.06571: done getting next task for host managed_node3 11728 1726882184.06572: ^ task is: TASK: Get stat for interface {{ interface }} 11728 1726882184.06575: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882184.06576: getting variables 11728 1726882184.06577: in VariableManager get_vars() 11728 1726882184.06582: Calling all_inventory to load vars for managed_node3 11728 1726882184.06583: Calling groups_inventory to load vars for managed_node3 11728 1726882184.06585: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882184.06588: Calling all_plugins_play to load vars for managed_node3 11728 1726882184.06589: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882184.06591: Calling groups_plugins_play to load vars for managed_node3 11728 1726882184.06671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882184.06778: done with get_vars() 11728 1726882184.06785: done getting variables 11728 1726882184.06897: variable 'interface' from source: task vars 11728 1726882184.06900: variable 'dhcp_interface1' from source: play vars 11728 1726882184.06943: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:29:44 -0400 (0:00:00.020) 0:00:08.922 ****** 11728 1726882184.06965: entering _queue_task() for managed_node3/stat 11728 1726882184.07152: worker is 1 (out of 1 available) 11728 1726882184.07163: exiting _queue_task() for managed_node3/stat 11728 1726882184.07175: done queuing things up, now waiting for results queue to drain 11728 1726882184.07176: waiting for pending results... 11728 1726882184.07323: running TaskExecutor() for managed_node3/TASK: Get stat for interface test1 11728 1726882184.07400: in run() - task 12673a56-9f93-5c28-a762-00000000017b 11728 1726882184.07414: variable 'ansible_search_path' from source: unknown 11728 1726882184.07417: variable 'ansible_search_path' from source: unknown 11728 1726882184.07440: calling self._execute() 11728 1726882184.07491: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882184.07501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882184.07506: variable 'omit' from source: magic vars 11728 1726882184.07744: variable 'ansible_distribution_major_version' from source: facts 11728 1726882184.07754: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882184.07759: variable 'omit' from source: magic vars 11728 1726882184.07799: variable 'omit' from source: magic vars 11728 1726882184.07864: variable 'interface' from source: task vars 11728 1726882184.07868: variable 'dhcp_interface1' from source: play vars 11728 1726882184.07913: variable 'dhcp_interface1' from source: play vars 11728 1726882184.07926: variable 'omit' from source: magic vars 11728 1726882184.07958: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882184.07983: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882184.08000: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882184.08013: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882184.08024: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 11728 1726882184.08045: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882184.08048: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882184.08051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882184.08118: Set connection var ansible_connection to ssh 11728 1726882184.08126: Set connection var ansible_shell_executable to /bin/sh 11728 1726882184.08131: Set connection var ansible_timeout to 10 11728 1726882184.08134: Set connection var ansible_shell_type to sh 11728 1726882184.08140: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882184.08145: Set connection var ansible_pipelining to False 11728 1726882184.08162: variable 'ansible_shell_executable' from source: unknown 11728 1726882184.08167: variable 'ansible_connection' from source: unknown 11728 1726882184.08169: variable 'ansible_module_compression' from source: unknown 11728 1726882184.08171: variable 'ansible_shell_type' from source: unknown 11728 1726882184.08173: variable 'ansible_shell_executable' from source: unknown 11728 1726882184.08175: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882184.08177: variable 'ansible_pipelining' from source: unknown 11728 1726882184.08180: variable 'ansible_timeout' from source: unknown 11728 1726882184.08182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882184.08324: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11728 1726882184.08332: variable 'omit' from source: magic vars 11728 1726882184.08337: starting attempt loop 11728 1726882184.08340: running the handler 11728 1726882184.08352: _low_level_execute_command(): starting 11728 1726882184.08359: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882184.08867: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882184.08871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882184.08874: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882184.08876: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882184.08932: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882184.08935: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882184.08937: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 
1726882184.08996: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882184.10573: stdout chunk (state=3): >>>/root <<< 11728 1726882184.10669: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882184.10698: stderr chunk (state=3): >>><<< 11728 1726882184.10709: stdout chunk (state=3): >>><<< 11728 1726882184.10728: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882184.10738: _low_level_execute_command(): starting 11728 1726882184.10743: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882184.1072662-12212-63322708455295 `" && echo ansible-tmp-1726882184.1072662-12212-63322708455295="` echo /root/.ansible/tmp/ansible-tmp-1726882184.1072662-12212-63322708455295 `" ) && sleep 0' 11728 1726882184.11191: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882184.11197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882184.11208: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882184.11210: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882184.11212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 11728 1726882184.11214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882184.11259: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882184.11267: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882184.11270: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882184.11313: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882184.13157: stdout chunk (state=3): >>>ansible-tmp-1726882184.1072662-12212-63322708455295=/root/.ansible/tmp/ansible-tmp-1726882184.1072662-12212-63322708455295 <<< 11728 1726882184.13263: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882184.13298: stderr chunk (state=3): >>><<< 11728 1726882184.13303: stdout chunk (state=3): >>><<< 11728 1726882184.13316: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882184.1072662-12212-63322708455295=/root/.ansible/tmp/ansible-tmp-1726882184.1072662-12212-63322708455295 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882184.13352: variable 'ansible_module_compression' from source: unknown 11728 1726882184.13395: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11728 1726882184.13427: variable 'ansible_facts' from source: unknown 11728 1726882184.13487: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882184.1072662-12212-63322708455295/AnsiballZ_stat.py 11728 1726882184.13589: Sending initial data 11728 1726882184.13596: Sent initial data (152 bytes) 11728 1726882184.14058: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882184.14061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882184.14065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882184.14067: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882184.14069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.229 originally 10.31.10.229 debug2: match found <<< 11728 1726882184.14071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882184.14118: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882184.14121: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882184.14171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882184.15680: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882184.15722: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882184.15767: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmp9gvxs9w2 /root/.ansible/tmp/ansible-tmp-1726882184.1072662-12212-63322708455295/AnsiballZ_stat.py <<< 11728 1726882184.15770: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882184.1072662-12212-63322708455295/AnsiballZ_stat.py" <<< 11728 1726882184.15816: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmp9gvxs9w2" to remote "/root/.ansible/tmp/ansible-tmp-1726882184.1072662-12212-63322708455295/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882184.1072662-12212-63322708455295/AnsiballZ_stat.py" <<< 11728 1726882184.16363: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882184.16406: stderr chunk (state=3): >>><<< 11728 1726882184.16409: stdout chunk (state=3): >>><<< 11728 1726882184.16432: done transferring module to remote 11728 1726882184.16449: _low_level_execute_command(): starting 11728 1726882184.16452: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882184.1072662-12212-63322708455295/ /root/.ansible/tmp/ansible-tmp-1726882184.1072662-12212-63322708455295/AnsiballZ_stat.py && sleep 0' 11728 1726882184.16881: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882184.16884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882184.16886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882184.16896: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882184.16899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882184.16940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882184.16944: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882184.16992: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882184.18689: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882184.18714: stderr chunk (state=3): >>><<< 11728 1726882184.18718: stdout chunk (state=3): >>><<< 11728 1726882184.18733: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882184.18736: _low_level_execute_command(): starting 11728 1726882184.18739: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882184.1072662-12212-63322708455295/AnsiballZ_stat.py && sleep 0' 11728 1726882184.19132: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882184.19136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882184.19147: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882184.19200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882184.19204: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882184.19260: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882184.34456: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27428, "dev": 23, "nlink": 1, "atime": 1726882182.7909007, "mtime": 1726882182.7909007, "ctime": 1726882182.7909007, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11728 1726882184.35906: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882184.35910: stdout chunk (state=3): >>><<< 11728 1726882184.35912: stderr chunk (state=3): >>><<< 11728 1726882184.35915: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27428, "dev": 23, "nlink": 1, "atime": 1726882182.7909007, "mtime": 1726882182.7909007, "ctime": 1726882182.7909007, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 11728 1726882184.35917: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882184.1072662-12212-63322708455295/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882184.35924: _low_level_execute_command(): starting 11728 1726882184.35927: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882184.1072662-12212-63322708455295/ > /dev/null 2>&1 && sleep 0' 11728 1726882184.36863: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882184.36881: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882184.36913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882184.37016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882184.37033: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882184.37050: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882184.37072: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882184.37148: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882184.39012: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882184.39036: stderr chunk (state=3): >>><<< 11728 1726882184.39049: stdout chunk (state=3): >>><<< 11728 1726882184.39068: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882184.39078: handler run complete 11728 1726882184.39137: attempt loop complete, returning result 11728 1726882184.39144: _execute() done 11728 1726882184.39151: dumping result to json 11728 1726882184.39160: done dumping result, returning 11728 1726882184.39236: done running TaskExecutor() for managed_node3/TASK: Get stat for interface test1 [12673a56-9f93-5c28-a762-00000000017b] 11728 1726882184.39240: sending task result for task 12673a56-9f93-5c28-a762-00000000017b 11728 1726882184.39315: done sending task result for task 12673a56-9f93-5c28-a762-00000000017b 11728 1726882184.39318: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726882182.7909007, "block_size": 4096, "blocks": 0, "ctime": 1726882182.7909007, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 27428, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1726882182.7909007, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11728 1726882184.39412: no more pending results, returning what we have 11728 1726882184.39416: results queue empty 11728 1726882184.39417: checking for any_errors_fatal 11728 1726882184.39418: done checking for any_errors_fatal 11728 1726882184.39419: checking for max_fail_percentage 11728 1726882184.39421: done checking for max_fail_percentage 11728 1726882184.39422: checking to see if all hosts have failed and the running result is not ok 11728 1726882184.39422: done checking to see if all hosts have failed 11728 1726882184.39423: getting the remaining hosts for this loop 11728 1726882184.39425: done getting the remaining hosts for this loop 11728 1726882184.39430: getting the next task for host managed_node3 11728 1726882184.39438: done getting next task for host managed_node3 11728 1726882184.39441: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11728 1726882184.39444: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882184.39448: getting variables 11728 1726882184.39450: in VariableManager get_vars() 11728 1726882184.39479: Calling all_inventory to load vars for managed_node3 11728 1726882184.39481: Calling groups_inventory to load vars for managed_node3 11728 1726882184.39484: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882184.39701: Calling all_plugins_play to load vars for managed_node3 11728 1726882184.39705: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882184.39709: Calling groups_plugins_play to load vars for managed_node3 11728 1726882184.40176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882184.40384: done with get_vars() 11728 1726882184.40395: done getting variables 11728 1726882184.40489: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 11728 1726882184.40615: variable 'interface' from source: task vars 11728 1726882184.40619: variable 'dhcp_interface1' from source: play vars 11728 1726882184.40679: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:29:44 -0400 (0:00:00.337) 0:00:09.259 ****** 11728 1726882184.40711: entering _queue_task() for managed_node3/assert 11728 1726882184.40713: Creating lock for assert 11728 1726882184.40956: worker is 1 (out of 1 available) 11728 1726882184.40968: exiting _queue_task() for managed_node3/assert 11728 1726882184.40979: done queuing things up, now waiting for results queue to drain 11728 1726882184.40980: waiting for pending results... 
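The exchange just above is the full lifecycle of the "Get stat for interface test1" task: the worker creates a remote temp directory, uploads AnsiballZ_stat.py over the existing SSH ControlMaster via sftp, runs it with /usr/bin/python3.12, parses the JSON result, and removes the temp directory. From the module_args echoed by _execute_module and the interface_stat variable the following assert consumes, the included tasks/get_interface_stat.yml plausibly looks like the sketch below; this is a reconstruction from the log, not the verified file contents.

    # Plausible reconstruction of tasks/get_interface_stat.yml; the register name is
    # inferred from the 'interface_stat' variable read by the assert task that follows.
    - name: Get stat for interface {{ interface }}
      ansible.builtin.stat:
        path: "/sys/class/net/{{ interface }}"   # /sys/class/net/test1 in this run
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: interface_stat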
11728 1726882184.41346: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test1' 11728 1726882184.41445: in run() - task 12673a56-9f93-5c28-a762-00000000011d 11728 1726882184.41450: variable 'ansible_search_path' from source: unknown 11728 1726882184.41452: variable 'ansible_search_path' from source: unknown 11728 1726882184.41481: calling self._execute() 11728 1726882184.41566: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882184.41598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882184.41601: variable 'omit' from source: magic vars 11728 1726882184.41957: variable 'ansible_distribution_major_version' from source: facts 11728 1726882184.41989: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882184.41995: variable 'omit' from source: magic vars 11728 1726882184.42100: variable 'omit' from source: magic vars 11728 1726882184.42158: variable 'interface' from source: task vars 11728 1726882184.42168: variable 'dhcp_interface1' from source: play vars 11728 1726882184.42240: variable 'dhcp_interface1' from source: play vars 11728 1726882184.42263: variable 'omit' from source: magic vars 11728 1726882184.42314: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882184.42359: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882184.42428: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882184.42431: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882184.42434: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882184.42462: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882184.42471: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882184.42479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882184.42573: Set connection var ansible_connection to ssh 11728 1726882184.42587: Set connection var ansible_shell_executable to /bin/sh 11728 1726882184.42598: Set connection var ansible_timeout to 10 11728 1726882184.42604: Set connection var ansible_shell_type to sh 11728 1726882184.42614: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882184.42644: Set connection var ansible_pipelining to False 11728 1726882184.42652: variable 'ansible_shell_executable' from source: unknown 11728 1726882184.42658: variable 'ansible_connection' from source: unknown 11728 1726882184.42664: variable 'ansible_module_compression' from source: unknown 11728 1726882184.42669: variable 'ansible_shell_type' from source: unknown 11728 1726882184.42754: variable 'ansible_shell_executable' from source: unknown 11728 1726882184.42758: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882184.42760: variable 'ansible_pipelining' from source: unknown 11728 1726882184.42762: variable 'ansible_timeout' from source: unknown 11728 1726882184.42764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882184.42834: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882184.42849: variable 'omit' from source: magic vars 11728 1726882184.42864: starting attempt loop 11728 1726882184.42871: running the handler 11728 1726882184.43001: variable 'interface_stat' from source: set_fact 11728 1726882184.43028: Evaluated conditional (interface_stat.stat.exists): True 11728 1726882184.43039: handler run complete 11728 1726882184.43056: attempt loop complete, returning result 11728 1726882184.43062: _execute() done 11728 1726882184.43068: dumping result to json 11728 1726882184.43082: done dumping result, returning 11728 1726882184.43092: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test1' [12673a56-9f93-5c28-a762-00000000011d] 11728 1726882184.43105: sending task result for task 12673a56-9f93-5c28-a762-00000000011d 11728 1726882184.43253: done sending task result for task 12673a56-9f93-5c28-a762-00000000011d 11728 1726882184.43256: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11728 1726882184.43338: no more pending results, returning what we have 11728 1726882184.43342: results queue empty 11728 1726882184.43343: checking for any_errors_fatal 11728 1726882184.43353: done checking for any_errors_fatal 11728 1726882184.43353: checking for max_fail_percentage 11728 1726882184.43355: done checking for max_fail_percentage 11728 1726882184.43356: checking to see if all hosts have failed and the running result is not ok 11728 1726882184.43357: done checking to see if all hosts have failed 11728 1726882184.43358: getting the remaining hosts for this loop 11728 1726882184.43360: done getting the remaining hosts for this loop 11728 1726882184.43363: getting the next task for host managed_node3 11728 1726882184.43372: done getting next task for host managed_node3 11728 1726882184.43374: ^ task is: TASK: Include the task 'get_interface_stat.yml' 11728 1726882184.43379: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882184.43382: getting variables 11728 1726882184.43384: in VariableManager get_vars() 11728 1726882184.43413: Calling all_inventory to load vars for managed_node3 11728 1726882184.43416: Calling groups_inventory to load vars for managed_node3 11728 1726882184.43420: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882184.43430: Calling all_plugins_play to load vars for managed_node3 11728 1726882184.43433: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882184.43436: Calling groups_plugins_play to load vars for managed_node3 11728 1726882184.43720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882184.43963: done with get_vars() 11728 1726882184.43972: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:29:44 -0400 (0:00:00.033) 0:00:09.293 ****** 11728 1726882184.44068: entering _queue_task() for managed_node3/include_tasks 11728 1726882184.44390: worker is 1 (out of 1 available) 11728 1726882184.44402: exiting _queue_task() for managed_node3/include_tasks 11728 1726882184.44411: done queuing things up, now waiting for results queue to drain 11728 1726882184.44412: waiting for pending results... 11728 1726882184.44603: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 11728 1726882184.44679: in run() - task 12673a56-9f93-5c28-a762-000000000121 11728 1726882184.44702: variable 'ansible_search_path' from source: unknown 11728 1726882184.44710: variable 'ansible_search_path' from source: unknown 11728 1726882184.44799: calling self._execute() 11728 1726882184.44829: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882184.44839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882184.44855: variable 'omit' from source: magic vars 11728 1726882184.45209: variable 'ansible_distribution_major_version' from source: facts 11728 1726882184.45228: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882184.45245: _execute() done 11728 1726882184.45256: dumping result to json 11728 1726882184.45265: done dumping result, returning 11728 1726882184.45279: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-5c28-a762-000000000121] 11728 1726882184.45353: sending task result for task 12673a56-9f93-5c28-a762-000000000121 11728 1726882184.45418: done sending task result for task 12673a56-9f93-5c28-a762-000000000121 11728 1726882184.45421: WORKER PROCESS EXITING 11728 1726882184.45445: no more pending results, returning what we have 11728 1726882184.45450: in VariableManager get_vars() 11728 1726882184.45487: Calling all_inventory to load vars for managed_node3 11728 1726882184.45490: Calling groups_inventory to load vars for managed_node3 11728 1726882184.45495: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882184.45508: Calling all_plugins_play to load vars for managed_node3 11728 1726882184.45510: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882184.45513: Calling groups_plugins_play to load vars for managed_node3 11728 1726882184.45911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 11728 1726882184.46106: done with get_vars() 11728 1726882184.46118: variable 'ansible_search_path' from source: unknown 11728 1726882184.46120: variable 'ansible_search_path' from source: unknown 11728 1726882184.46154: we have included files to process 11728 1726882184.46155: generating all_blocks data 11728 1726882184.46157: done generating all_blocks data 11728 1726882184.46161: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11728 1726882184.46162: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11728 1726882184.46164: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11728 1726882184.46357: done processing included file 11728 1726882184.46359: iterating over new_blocks loaded from include file 11728 1726882184.46360: in VariableManager get_vars() 11728 1726882184.46375: done with get_vars() 11728 1726882184.46377: filtering new block on tags 11728 1726882184.46409: done filtering new block on tags 11728 1726882184.46412: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 11728 1726882184.46417: extending task lists for all hosts with included blocks 11728 1726882184.46638: done extending task lists 11728 1726882184.46639: done processing included files 11728 1726882184.46640: results queue empty 11728 1726882184.46640: checking for any_errors_fatal 11728 1726882184.46643: done checking for any_errors_fatal 11728 1726882184.46644: checking for max_fail_percentage 11728 1726882184.46645: done checking for max_fail_percentage 11728 1726882184.46646: checking to see if all hosts have failed and the running result is not ok 11728 1726882184.46646: done checking to see if all hosts have failed 11728 1726882184.46647: getting the remaining hosts for this loop 11728 1726882184.46648: done getting the remaining hosts for this loop 11728 1726882184.46651: getting the next task for host managed_node3 11728 1726882184.46661: done getting next task for host managed_node3 11728 1726882184.46663: ^ task is: TASK: Get stat for interface {{ interface }} 11728 1726882184.46667: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882184.46670: getting variables 11728 1726882184.46671: in VariableManager get_vars() 11728 1726882184.46679: Calling all_inventory to load vars for managed_node3 11728 1726882184.46681: Calling groups_inventory to load vars for managed_node3 11728 1726882184.46683: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882184.46687: Calling all_plugins_play to load vars for managed_node3 11728 1726882184.46689: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882184.46692: Calling groups_plugins_play to load vars for managed_node3 11728 1726882184.46833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882184.47029: done with get_vars() 11728 1726882184.47037: done getting variables 11728 1726882184.47185: variable 'interface' from source: task vars 11728 1726882184.47188: variable 'dhcp_interface2' from source: play vars 11728 1726882184.47254: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:29:44 -0400 (0:00:00.032) 0:00:09.325 ****** 11728 1726882184.47286: entering _queue_task() for managed_node3/stat 11728 1726882184.47617: worker is 1 (out of 1 available) 11728 1726882184.47632: exiting _queue_task() for managed_node3/stat 11728 1726882184.47641: done queuing things up, now waiting for results queue to drain 11728 1726882184.47643: waiting for pending results... 
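At this point the worker has finished the test1 assert (the conditional interface_stat.stat.exists evaluated to True, so the task reported "All assertions passed") and has re-included get_interface_stat.yml for the second interface. Given the two task paths in the log, assert_device_present.yml:3 for the include and :5 for the assert, that helper file plausibly has the shape sketched below; only the task names and the exists check are confirmed by the log, and the failure message is illustrative.

    # Plausible shape of tasks/assert_device_present.yml, inferred from the task paths
    # and the evaluated conditional; the msg text is an assumption.
    - name: Include the task 'get_interface_stat.yml'
      ansible.builtin.include_tasks: get_interface_stat.yml

    - name: Assert that the interface is present - '{{ interface }}'
      ansible.builtin.assert:
        that:
          - interface_stat.stat.exists
        msg: "Interface {{ interface }} is not present"   # illustrative wording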
11728 1726882184.47864: running TaskExecutor() for managed_node3/TASK: Get stat for interface test2 11728 1726882184.47942: in run() - task 12673a56-9f93-5c28-a762-00000000019f 11728 1726882184.47967: variable 'ansible_search_path' from source: unknown 11728 1726882184.47974: variable 'ansible_search_path' from source: unknown 11728 1726882184.48072: calling self._execute() 11728 1726882184.48097: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882184.48109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882184.48121: variable 'omit' from source: magic vars 11728 1726882184.48549: variable 'ansible_distribution_major_version' from source: facts 11728 1726882184.48564: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882184.48576: variable 'omit' from source: magic vars 11728 1726882184.48654: variable 'omit' from source: magic vars 11728 1726882184.48757: variable 'interface' from source: task vars 11728 1726882184.48766: variable 'dhcp_interface2' from source: play vars 11728 1726882184.48838: variable 'dhcp_interface2' from source: play vars 11728 1726882184.48943: variable 'omit' from source: magic vars 11728 1726882184.48946: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882184.48949: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882184.48968: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882184.48989: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882184.49009: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882184.49039: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882184.49054: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882184.49061: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882184.49157: Set connection var ansible_connection to ssh 11728 1726882184.49176: Set connection var ansible_shell_executable to /bin/sh 11728 1726882184.49185: Set connection var ansible_timeout to 10 11728 1726882184.49191: Set connection var ansible_shell_type to sh 11728 1726882184.49204: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882184.49213: Set connection var ansible_pipelining to False 11728 1726882184.49239: variable 'ansible_shell_executable' from source: unknown 11728 1726882184.49246: variable 'ansible_connection' from source: unknown 11728 1726882184.49252: variable 'ansible_module_compression' from source: unknown 11728 1726882184.49258: variable 'ansible_shell_type' from source: unknown 11728 1726882184.49271: variable 'ansible_shell_executable' from source: unknown 11728 1726882184.49378: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882184.49381: variable 'ansible_pipelining' from source: unknown 11728 1726882184.49384: variable 'ansible_timeout' from source: unknown 11728 1726882184.49386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882184.49492: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11728 1726882184.49509: variable 'omit' from source: magic vars 11728 1726882184.49518: starting attempt loop 11728 1726882184.49524: running the handler 11728 1726882184.49540: _low_level_execute_command(): starting 11728 1726882184.49552: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882184.50314: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882184.50520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882184.50544: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882184.50707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882184.52324: stdout chunk (state=3): >>>/root <<< 11728 1726882184.52457: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882184.52476: stderr chunk (state=3): >>><<< 11728 1726882184.52762: stdout chunk (state=3): >>><<< 11728 1726882184.52769: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882184.52772: _low_level_execute_command(): starting 11728 1726882184.52775: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882184.5258296-12226-135783861944410 `" && echo ansible-tmp-1726882184.5258296-12226-135783861944410="` echo /root/.ansible/tmp/ansible-tmp-1726882184.5258296-12226-135783861944410 `" ) && sleep 0' 11728 1726882184.54113: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882184.54223: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882184.54244: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882184.54255: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882184.54371: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882184.56233: stdout chunk (state=3): >>>ansible-tmp-1726882184.5258296-12226-135783861944410=/root/.ansible/tmp/ansible-tmp-1726882184.5258296-12226-135783861944410 <<< 11728 1726882184.56354: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882184.56412: stderr chunk (state=3): >>><<< 11728 1726882184.56429: stdout chunk (state=3): >>><<< 11728 1726882184.56450: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882184.5258296-12226-135783861944410=/root/.ansible/tmp/ansible-tmp-1726882184.5258296-12226-135783861944410 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882184.56513: variable 'ansible_module_compression' from source: unknown 11728 1726882184.56576: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11728 1726882184.56629: variable 'ansible_facts' from source: unknown 11728 1726882184.56760: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882184.5258296-12226-135783861944410/AnsiballZ_stat.py 11728 1726882184.56931: Sending initial data 11728 1726882184.57015: Sent initial data (153 bytes) 11728 1726882184.58005: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882184.58030: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882184.58108: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882184.59635: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882184.59673: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882184.59723: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpbs8vvdp2 /root/.ansible/tmp/ansible-tmp-1726882184.5258296-12226-135783861944410/AnsiballZ_stat.py <<< 11728 1726882184.59736: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882184.5258296-12226-135783861944410/AnsiballZ_stat.py" <<< 11728 1726882184.59772: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 11728 1726882184.59792: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpbs8vvdp2" to remote "/root/.ansible/tmp/ansible-tmp-1726882184.5258296-12226-135783861944410/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882184.5258296-12226-135783861944410/AnsiballZ_stat.py" <<< 11728 1726882184.60519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882184.60691: stderr chunk (state=3): >>><<< 11728 1726882184.60697: stdout chunk (state=3): >>><<< 11728 1726882184.60699: done transferring module to remote 11728 1726882184.60701: _low_level_execute_command(): starting 11728 1726882184.60704: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882184.5258296-12226-135783861944410/ /root/.ansible/tmp/ansible-tmp-1726882184.5258296-12226-135783861944410/AnsiballZ_stat.py && sleep 0' 11728 1726882184.61280: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882184.61358: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882184.61418: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882184.61430: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882184.61466: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882184.61536: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882184.63302: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882184.63305: stdout chunk (state=3): >>><<< 11728 1726882184.63308: stderr chunk (state=3): >>><<< 11728 1726882184.63400: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882184.63403: _low_level_execute_command(): starting 11728 1726882184.63406: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882184.5258296-12226-135783861944410/AnsiballZ_stat.py && sleep 0' 11728 1726882184.64010: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882184.64026: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882184.64066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882184.64078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882184.64182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882184.64228: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882184.64271: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882184.79318: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27834, "dev": 23, "nlink": 1, "atime": 1726882182.7964191, "mtime": 1726882182.7964191, "ctime": 1726882182.7964191, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", 
"follow": false, "checksum_algorithm": "sha1"}}} <<< 11728 1726882184.80620: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882184.80645: stdout chunk (state=3): >>><<< 11728 1726882184.80648: stderr chunk (state=3): >>><<< 11728 1726882184.80703: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27834, "dev": 23, "nlink": 1, "atime": 1726882182.7964191, "mtime": 1726882182.7964191, "ctime": 1726882182.7964191, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
11728 1726882184.80735: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882184.5258296-12226-135783861944410/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882184.80755: _low_level_execute_command(): starting 11728 1726882184.80764: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882184.5258296-12226-135783861944410/ > /dev/null 2>&1 && sleep 0' 11728 1726882184.81408: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882184.81423: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882184.81448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882184.81466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882184.81484: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882184.81564: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882184.81603: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882184.81623: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882184.81645: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882184.81732: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882184.83599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882184.83610: stdout chunk (state=3): >>><<< 11728 1726882184.83625: stderr chunk (state=3): >>><<< 11728 1726882184.83651: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882184.83663: handler run complete 11728 1726882184.83716: attempt loop complete, returning result 11728 1726882184.83798: _execute() done 11728 1726882184.83801: dumping result to json 11728 1726882184.83804: done dumping result, returning 11728 1726882184.83806: done running TaskExecutor() for managed_node3/TASK: Get stat for interface test2 [12673a56-9f93-5c28-a762-00000000019f] 11728 1726882184.83808: sending task result for task 12673a56-9f93-5c28-a762-00000000019f 11728 1726882184.83886: done sending task result for task 12673a56-9f93-5c28-a762-00000000019f ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726882182.7964191, "block_size": 4096, "blocks": 0, "ctime": 1726882182.7964191, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 27834, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1726882182.7964191, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11728 1726882184.84063: no more pending results, returning what we have 11728 1726882184.84067: results queue empty 11728 1726882184.84068: checking for any_errors_fatal 11728 1726882184.84070: done checking for any_errors_fatal 11728 1726882184.84070: checking for max_fail_percentage 11728 1726882184.84073: done checking for max_fail_percentage 11728 1726882184.84074: checking to see if all hosts have failed and the running result is not ok 11728 1726882184.84074: done checking to see if all hosts have failed 11728 1726882184.84075: getting the remaining hosts for this loop 11728 1726882184.84077: done getting the remaining hosts for this loop 11728 1726882184.84080: getting the next task for host managed_node3 11728 1726882184.84089: done getting next task for host managed_node3 11728 1726882184.84091: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11728 1726882184.84305: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882184.84309: getting variables 11728 1726882184.84311: in VariableManager get_vars() 11728 1726882184.84337: Calling all_inventory to load vars for managed_node3 11728 1726882184.84340: Calling groups_inventory to load vars for managed_node3 11728 1726882184.84343: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882184.84353: Calling all_plugins_play to load vars for managed_node3 11728 1726882184.84355: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882184.84359: Calling groups_plugins_play to load vars for managed_node3 11728 1726882184.84605: WORKER PROCESS EXITING 11728 1726882184.84634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882184.84845: done with get_vars() 11728 1726882184.84859: done getting variables 11728 1726882184.84918: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11728 1726882184.85043: variable 'interface' from source: task vars 11728 1726882184.85047: variable 'dhcp_interface2' from source: play vars 11728 1726882184.85113: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:29:44 -0400 (0:00:00.378) 0:00:09.703 ****** 11728 1726882184.85145: entering _queue_task() for managed_node3/assert 11728 1726882184.85513: worker is 1 (out of 1 available) 11728 1726882184.85524: exiting _queue_task() for managed_node3/assert 11728 1726882184.85534: done queuing things up, now waiting for results queue to drain 11728 1726882184.85535: waiting for pending results... 
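Given the task name, the task path assert_device_present.yml:5, and the interface_stat fact the handler consumes, the assert being queued here is most likely equivalent to the sketch below; the exact file contents are not shown in this log:

    - name: "Assert that the interface is present - '{{ interface }}'"
      ansible.builtin.assert:
        that:
          - interface_stat.stat.exists   # the handler evaluates exactly this conditional before reporting "All assertions passed"
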
11728 1726882184.85690: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test2' 11728 1726882184.85820: in run() - task 12673a56-9f93-5c28-a762-000000000122 11728 1726882184.85846: variable 'ansible_search_path' from source: unknown 11728 1726882184.85856: variable 'ansible_search_path' from source: unknown 11728 1726882184.85902: calling self._execute() 11728 1726882184.86048: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882184.86052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882184.86055: variable 'omit' from source: magic vars 11728 1726882184.86384: variable 'ansible_distribution_major_version' from source: facts 11728 1726882184.86405: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882184.86423: variable 'omit' from source: magic vars 11728 1726882184.86492: variable 'omit' from source: magic vars 11728 1726882184.86600: variable 'interface' from source: task vars 11728 1726882184.86611: variable 'dhcp_interface2' from source: play vars 11728 1726882184.86679: variable 'dhcp_interface2' from source: play vars 11728 1726882184.86742: variable 'omit' from source: magic vars 11728 1726882184.86757: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882184.86798: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882184.86829: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882184.86856: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882184.86919: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882184.86922: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882184.86925: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882184.86927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882184.87032: Set connection var ansible_connection to ssh 11728 1726882184.87048: Set connection var ansible_shell_executable to /bin/sh 11728 1726882184.87060: Set connection var ansible_timeout to 10 11728 1726882184.87073: Set connection var ansible_shell_type to sh 11728 1726882184.87085: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882184.87137: Set connection var ansible_pipelining to False 11728 1726882184.87140: variable 'ansible_shell_executable' from source: unknown 11728 1726882184.87142: variable 'ansible_connection' from source: unknown 11728 1726882184.87144: variable 'ansible_module_compression' from source: unknown 11728 1726882184.87146: variable 'ansible_shell_type' from source: unknown 11728 1726882184.87151: variable 'ansible_shell_executable' from source: unknown 11728 1726882184.87159: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882184.87167: variable 'ansible_pipelining' from source: unknown 11728 1726882184.87180: variable 'ansible_timeout' from source: unknown 11728 1726882184.87188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882184.87337: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882184.87464: variable 'omit' from source: magic vars 11728 1726882184.87467: starting attempt loop 11728 1726882184.87470: running the handler 11728 1726882184.87507: variable 'interface_stat' from source: set_fact 11728 1726882184.87531: Evaluated conditional (interface_stat.stat.exists): True 11728 1726882184.87542: handler run complete 11728 1726882184.87560: attempt loop complete, returning result 11728 1726882184.87574: _execute() done 11728 1726882184.87583: dumping result to json 11728 1726882184.87590: done dumping result, returning 11728 1726882184.87603: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test2' [12673a56-9f93-5c28-a762-000000000122] 11728 1726882184.87614: sending task result for task 12673a56-9f93-5c28-a762-000000000122 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11728 1726882184.87830: no more pending results, returning what we have 11728 1726882184.87834: results queue empty 11728 1726882184.87835: checking for any_errors_fatal 11728 1726882184.87844: done checking for any_errors_fatal 11728 1726882184.87845: checking for max_fail_percentage 11728 1726882184.87846: done checking for max_fail_percentage 11728 1726882184.87847: checking to see if all hosts have failed and the running result is not ok 11728 1726882184.87848: done checking to see if all hosts have failed 11728 1726882184.87849: getting the remaining hosts for this loop 11728 1726882184.87850: done getting the remaining hosts for this loop 11728 1726882184.87854: getting the next task for host managed_node3 11728 1726882184.87863: done getting next task for host managed_node3 11728 1726882184.87866: ^ task is: TASK: Test 11728 1726882184.87869: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882184.87874: getting variables 11728 1726882184.87875: in VariableManager get_vars() 11728 1726882184.87905: Calling all_inventory to load vars for managed_node3 11728 1726882184.87909: Calling groups_inventory to load vars for managed_node3 11728 1726882184.87914: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882184.87925: Calling all_plugins_play to load vars for managed_node3 11728 1726882184.87929: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882184.87933: Calling groups_plugins_play to load vars for managed_node3 11728 1726882184.88225: done sending task result for task 12673a56-9f93-5c28-a762-000000000122 11728 1726882184.88229: WORKER PROCESS EXITING 11728 1726882184.88249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882184.88477: done with get_vars() 11728 1726882184.88486: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 21:29:44 -0400 (0:00:00.034) 0:00:09.738 ****** 11728 1726882184.88572: entering _queue_task() for managed_node3/include_tasks 11728 1726882184.88890: worker is 1 (out of 1 available) 11728 1726882184.88902: exiting _queue_task() for managed_node3/include_tasks 11728 1726882184.88911: done queuing things up, now waiting for results queue to drain 11728 1726882184.88912: waiting for pending results... 11728 1726882184.89055: running TaskExecutor() for managed_node3/TASK: Test 11728 1726882184.89154: in run() - task 12673a56-9f93-5c28-a762-00000000008c 11728 1726882184.89173: variable 'ansible_search_path' from source: unknown 11728 1726882184.89181: variable 'ansible_search_path' from source: unknown 11728 1726882184.89231: variable 'lsr_test' from source: include params 11728 1726882184.89423: variable 'lsr_test' from source: include params 11728 1726882184.89490: variable 'omit' from source: magic vars 11728 1726882184.89613: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882184.89632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882184.89649: variable 'omit' from source: magic vars 11728 1726882184.89878: variable 'ansible_distribution_major_version' from source: facts 11728 1726882184.89898: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882184.89961: variable 'item' from source: unknown 11728 1726882184.89976: variable 'item' from source: unknown 11728 1726882184.90015: variable 'item' from source: unknown 11728 1726882184.90080: variable 'item' from source: unknown 11728 1726882184.90315: dumping result to json 11728 1726882184.90318: done dumping result, returning 11728 1726882184.90321: done running TaskExecutor() for managed_node3/TASK: Test [12673a56-9f93-5c28-a762-00000000008c] 11728 1726882184.90323: sending task result for task 12673a56-9f93-5c28-a762-00000000008c 11728 1726882184.90362: done sending task result for task 12673a56-9f93-5c28-a762-00000000008c 11728 1726882184.90364: WORKER PROCESS EXITING 11728 1726882184.90385: no more pending results, returning what we have 11728 1726882184.90396: in VariableManager get_vars() 11728 1726882184.90428: Calling all_inventory to load vars for managed_node3 11728 1726882184.90431: Calling groups_inventory to load vars for managed_node3 11728 1726882184.90434: 
Calling all_plugins_inventory to load vars for managed_node3 11728 1726882184.90446: Calling all_plugins_play to load vars for managed_node3 11728 1726882184.90448: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882184.90451: Calling groups_plugins_play to load vars for managed_node3 11728 1726882184.90745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882184.90941: done with get_vars() 11728 1726882184.90948: variable 'ansible_search_path' from source: unknown 11728 1726882184.90949: variable 'ansible_search_path' from source: unknown 11728 1726882184.90983: we have included files to process 11728 1726882184.90984: generating all_blocks data 11728 1726882184.90986: done generating all_blocks data 11728 1726882184.90989: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml 11728 1726882184.90991: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml 11728 1726882184.90995: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml 11728 1726882184.91405: done processing included file 11728 1726882184.91407: iterating over new_blocks loaded from include file 11728 1726882184.91409: in VariableManager get_vars() 11728 1726882184.91422: done with get_vars() 11728 1726882184.91424: filtering new block on tags 11728 1726882184.91461: done filtering new block on tags 11728 1726882184.91463: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml for managed_node3 => (item=tasks/create_bond_profile.yml) 11728 1726882184.91468: extending task lists for all hosts with included blocks 11728 1726882184.92776: done extending task lists 11728 1726882184.92777: done processing included files 11728 1726882184.92778: results queue empty 11728 1726882184.92779: checking for any_errors_fatal 11728 1726882184.92781: done checking for any_errors_fatal 11728 1726882184.92782: checking for max_fail_percentage 11728 1726882184.92783: done checking for max_fail_percentage 11728 1726882184.92784: checking to see if all hosts have failed and the running result is not ok 11728 1726882184.92785: done checking to see if all hosts have failed 11728 1726882184.92785: getting the remaining hosts for this loop 11728 1726882184.92787: done getting the remaining hosts for this loop 11728 1726882184.92789: getting the next task for host managed_node3 11728 1726882184.92795: done getting next task for host managed_node3 11728 1726882184.92797: ^ task is: TASK: Include network role 11728 1726882184.92800: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882184.92802: getting variables 11728 1726882184.92803: in VariableManager get_vars() 11728 1726882184.92811: Calling all_inventory to load vars for managed_node3 11728 1726882184.92813: Calling groups_inventory to load vars for managed_node3 11728 1726882184.92816: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882184.92821: Calling all_plugins_play to load vars for managed_node3 11728 1726882184.92823: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882184.92826: Calling groups_plugins_play to load vars for managed_node3 11728 1726882184.92974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882184.93170: done with get_vars() 11728 1726882184.93178: done getting variables TASK [Include network role] **************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml:3 Friday 20 September 2024 21:29:44 -0400 (0:00:00.046) 0:00:09.784 ****** 11728 1726882184.93253: entering _queue_task() for managed_node3/include_role 11728 1726882184.93255: Creating lock for include_role 11728 1726882184.93631: worker is 1 (out of 1 available) 11728 1726882184.93642: exiting _queue_task() for managed_node3/include_role 11728 1726882184.93652: done queuing things up, now waiting for results queue to drain 11728 1726882184.93653: waiting for pending results... 11728 1726882184.93865: running TaskExecutor() for managed_node3/TASK: Include network role 11728 1726882184.93924: in run() - task 12673a56-9f93-5c28-a762-0000000001c5 11728 1726882184.93961: variable 'ansible_search_path' from source: unknown 11728 1726882184.93964: variable 'ansible_search_path' from source: unknown 11728 1726882184.93994: calling self._execute() 11728 1726882184.94095: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882184.94099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882184.94101: variable 'omit' from source: magic vars 11728 1726882184.94452: variable 'ansible_distribution_major_version' from source: facts 11728 1726882184.94468: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882184.94478: _execute() done 11728 1726882184.94506: dumping result to json 11728 1726882184.94509: done dumping result, returning 11728 1726882184.94512: done running TaskExecutor() for managed_node3/TASK: Include network role [12673a56-9f93-5c28-a762-0000000001c5] 11728 1726882184.94514: sending task result for task 12673a56-9f93-5c28-a762-0000000001c5 11728 1726882184.94782: done sending task result for task 12673a56-9f93-5c28-a762-0000000001c5 11728 1726882184.94785: WORKER PROCESS EXITING 11728 1726882184.94812: no more pending results, returning what we have 11728 1726882184.94816: in VariableManager get_vars() 11728 1726882184.94852: Calling all_inventory to load vars for managed_node3 11728 1726882184.94855: Calling groups_inventory to load vars for managed_node3 11728 1726882184.94858: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882184.94870: Calling all_plugins_play to load vars for managed_node3 11728 1726882184.94872: Calling groups_plugins_inventory to load 
vars for managed_node3 11728 1726882184.94874: Calling groups_plugins_play to load vars for managed_node3 11728 1726882184.95204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882184.95399: done with get_vars() 11728 1726882184.95406: variable 'ansible_search_path' from source: unknown 11728 1726882184.95407: variable 'ansible_search_path' from source: unknown 11728 1726882184.95595: variable 'omit' from source: magic vars 11728 1726882184.95635: variable 'omit' from source: magic vars 11728 1726882184.95649: variable 'omit' from source: magic vars 11728 1726882184.95653: we have included files to process 11728 1726882184.95654: generating all_blocks data 11728 1726882184.95655: done generating all_blocks data 11728 1726882184.95656: processing included file: fedora.linux_system_roles.network 11728 1726882184.95677: in VariableManager get_vars() 11728 1726882184.95688: done with get_vars() 11728 1726882184.95758: in VariableManager get_vars() 11728 1726882184.95775: done with get_vars() 11728 1726882184.95827: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 11728 1726882184.96084: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 11728 1726882184.96221: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 11728 1726882184.96867: in VariableManager get_vars() 11728 1726882184.96886: done with get_vars() 11728 1726882184.97313: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11728 1726882184.98997: iterating over new_blocks loaded from include file 11728 1726882184.99000: in VariableManager get_vars() 11728 1726882184.99018: done with get_vars() 11728 1726882184.99020: filtering new block on tags 11728 1726882184.99329: done filtering new block on tags 11728 1726882184.99333: in VariableManager get_vars() 11728 1726882184.99347: done with get_vars() 11728 1726882184.99349: filtering new block on tags 11728 1726882184.99366: done filtering new block on tags 11728 1726882184.99367: done iterating over new_blocks loaded from include file included: fedora.linux_system_roles.network for managed_node3 11728 1726882184.99373: extending task lists for all hosts with included blocks 11728 1726882184.99547: done extending task lists 11728 1726882184.99549: done processing included files 11728 1726882184.99550: results queue empty 11728 1726882184.99551: checking for any_errors_fatal 11728 1726882184.99555: done checking for any_errors_fatal 11728 1726882184.99556: checking for max_fail_percentage 11728 1726882184.99557: done checking for max_fail_percentage 11728 1726882184.99558: checking to see if all hosts have failed and the running result is not ok 11728 1726882184.99559: done checking to see if all hosts have failed 11728 1726882184.99559: getting the remaining hosts for this loop 11728 1726882184.99561: done getting the remaining hosts for this loop 11728 1726882184.99563: getting the next task for host managed_node3 11728 1726882184.99568: done getting next task for host managed_node3 11728 1726882184.99571: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11728 1726882184.99574: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882184.99584: getting variables 11728 1726882184.99585: in VariableManager get_vars() 11728 1726882184.99600: Calling all_inventory to load vars for managed_node3 11728 1726882184.99603: Calling groups_inventory to load vars for managed_node3 11728 1726882184.99605: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882184.99615: Calling all_plugins_play to load vars for managed_node3 11728 1726882184.99618: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882184.99622: Calling groups_plugins_play to load vars for managed_node3 11728 1726882184.99999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882185.00226: done with get_vars() 11728 1726882185.00237: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:29:45 -0400 (0:00:00.070) 0:00:09.855 ****** 11728 1726882185.00325: entering _queue_task() for managed_node3/include_tasks 11728 1726882185.00830: worker is 1 (out of 1 available) 11728 1726882185.00839: exiting _queue_task() for managed_node3/include_tasks 11728 1726882185.00849: done queuing things up, now waiting for results queue to drain 11728 1726882185.00850: waiting for pending results... 
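The "Include network role" task that just completed (create_bond_profile.yml:3) and the role entry point it pulls in (roles/network/tasks/main.yml:4, now being queued) behave like the sketch below. This is an assumed reconstruction from the logged include of fedora.linux_system_roles.network and of set_facts.yml, not a copy of those files:

    # create_bond_profile.yml (sketch)
    - name: Include network role
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.network

    # roles/network/tasks/main.yml:4 (sketch) - the include_tasks currently waiting in the results queue
    - name: Ensure ansible_facts used by role
      ansible.builtin.include_tasks: set_facts.yml
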
11728 1726882185.01095: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11728 1726882185.01102: in run() - task 12673a56-9f93-5c28-a762-000000000277 11728 1726882185.01106: variable 'ansible_search_path' from source: unknown 11728 1726882185.01113: variable 'ansible_search_path' from source: unknown 11728 1726882185.01157: calling self._execute() 11728 1726882185.01249: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882185.01260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882185.01273: variable 'omit' from source: magic vars 11728 1726882185.01643: variable 'ansible_distribution_major_version' from source: facts 11728 1726882185.01658: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882185.01668: _execute() done 11728 1726882185.01674: dumping result to json 11728 1726882185.01680: done dumping result, returning 11728 1726882185.01691: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-5c28-a762-000000000277] 11728 1726882185.01728: sending task result for task 12673a56-9f93-5c28-a762-000000000277 11728 1726882185.01869: no more pending results, returning what we have 11728 1726882185.01874: in VariableManager get_vars() 11728 1726882185.01920: Calling all_inventory to load vars for managed_node3 11728 1726882185.01923: Calling groups_inventory to load vars for managed_node3 11728 1726882185.01925: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882185.01937: Calling all_plugins_play to load vars for managed_node3 11728 1726882185.01940: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882185.01942: Calling groups_plugins_play to load vars for managed_node3 11728 1726882185.02400: done sending task result for task 12673a56-9f93-5c28-a762-000000000277 11728 1726882185.02403: WORKER PROCESS EXITING 11728 1726882185.02431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882185.02696: done with get_vars() 11728 1726882185.02705: variable 'ansible_search_path' from source: unknown 11728 1726882185.02706: variable 'ansible_search_path' from source: unknown 11728 1726882185.02748: we have included files to process 11728 1726882185.02749: generating all_blocks data 11728 1726882185.02751: done generating all_blocks data 11728 1726882185.02759: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11728 1726882185.02760: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11728 1726882185.02763: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11728 1726882185.03501: done processing included file 11728 1726882185.03503: iterating over new_blocks loaded from include file 11728 1726882185.03504: in VariableManager get_vars() 11728 1726882185.03534: done with get_vars() 11728 1726882185.03536: filtering new block on tags 11728 1726882185.03566: done filtering new block on tags 11728 1726882185.03570: in VariableManager get_vars() 11728 1726882185.03595: done with get_vars() 11728 1726882185.03597: filtering new block on tags 11728 1726882185.03648: done filtering new block on tags 11728 1726882185.03650: in 
VariableManager get_vars() 11728 1726882185.03672: done with get_vars() 11728 1726882185.03674: filtering new block on tags 11728 1726882185.03716: done filtering new block on tags 11728 1726882185.03719: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 11728 1726882185.03725: extending task lists for all hosts with included blocks 11728 1726882185.05418: done extending task lists 11728 1726882185.05420: done processing included files 11728 1726882185.05421: results queue empty 11728 1726882185.05422: checking for any_errors_fatal 11728 1726882185.05425: done checking for any_errors_fatal 11728 1726882185.05426: checking for max_fail_percentage 11728 1726882185.05427: done checking for max_fail_percentage 11728 1726882185.05428: checking to see if all hosts have failed and the running result is not ok 11728 1726882185.05428: done checking to see if all hosts have failed 11728 1726882185.05429: getting the remaining hosts for this loop 11728 1726882185.05431: done getting the remaining hosts for this loop 11728 1726882185.05433: getting the next task for host managed_node3 11728 1726882185.05439: done getting next task for host managed_node3 11728 1726882185.05442: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11728 1726882185.05447: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882185.05458: getting variables 11728 1726882185.05459: in VariableManager get_vars() 11728 1726882185.05480: Calling all_inventory to load vars for managed_node3 11728 1726882185.05482: Calling groups_inventory to load vars for managed_node3 11728 1726882185.05484: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882185.05490: Calling all_plugins_play to load vars for managed_node3 11728 1726882185.05494: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882185.05498: Calling groups_plugins_play to load vars for managed_node3 11728 1726882185.05644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882185.05816: done with get_vars() 11728 1726882185.05826: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:29:45 -0400 (0:00:00.055) 0:00:09.911 ****** 11728 1726882185.05901: entering _queue_task() for managed_node3/setup 11728 1726882185.06217: worker is 1 (out of 1 available) 11728 1726882185.06229: exiting _queue_task() for managed_node3/setup 11728 1726882185.06354: done queuing things up, now waiting for results queue to drain 11728 1726882185.06356: waiting for pending results... 11728 1726882185.06531: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11728 1726882185.06692: in run() - task 12673a56-9f93-5c28-a762-0000000002d4 11728 1726882185.06715: variable 'ansible_search_path' from source: unknown 11728 1726882185.06722: variable 'ansible_search_path' from source: unknown 11728 1726882185.06762: calling self._execute() 11728 1726882185.06850: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882185.06862: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882185.06874: variable 'omit' from source: magic vars 11728 1726882185.07255: variable 'ansible_distribution_major_version' from source: facts 11728 1726882185.07326: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882185.07532: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882185.09646: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882185.09737: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882185.09781: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882185.09830: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882185.09862: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882185.10099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882185.10103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 11728 1726882185.10105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882185.10107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882185.10109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882185.10138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882185.10167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882185.10199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882185.10251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882185.10271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882185.10441: variable '__network_required_facts' from source: role '' defaults 11728 1726882185.10459: variable 'ansible_facts' from source: unknown 11728 1726882185.10560: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 11728 1726882185.10570: when evaluation is False, skipping this task 11728 1726882185.10578: _execute() done 11728 1726882185.10585: dumping result to json 11728 1726882185.10594: done dumping result, returning 11728 1726882185.10608: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-5c28-a762-0000000002d4] 11728 1726882185.10617: sending task result for task 12673a56-9f93-5c28-a762-0000000002d4 11728 1726882185.10735: done sending task result for task 12673a56-9f93-5c28-a762-0000000002d4 11728 1726882185.10738: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11728 1726882185.10817: no more pending results, returning what we have 11728 1726882185.10822: results queue empty 11728 1726882185.10823: checking for any_errors_fatal 11728 1726882185.10824: done checking for any_errors_fatal 11728 1726882185.10825: checking for max_fail_percentage 11728 1726882185.10827: done checking for max_fail_percentage 11728 1726882185.10828: checking to see if all hosts have failed and the running result is not ok 11728 1726882185.10828: done checking to see if all hosts have failed 11728 1726882185.10829: getting the remaining hosts for this loop 11728 1726882185.10831: done getting the remaining hosts for 
this loop 11728 1726882185.10835: getting the next task for host managed_node3 11728 1726882185.10846: done getting next task for host managed_node3 11728 1726882185.10851: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 11728 1726882185.10858: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882185.10986: getting variables 11728 1726882185.10988: in VariableManager get_vars() 11728 1726882185.11030: Calling all_inventory to load vars for managed_node3 11728 1726882185.11033: Calling groups_inventory to load vars for managed_node3 11728 1726882185.11036: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882185.11046: Calling all_plugins_play to load vars for managed_node3 11728 1726882185.11050: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882185.11061: Calling groups_plugins_play to load vars for managed_node3 11728 1726882185.11466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882185.11685: done with get_vars() 11728 1726882185.11699: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:29:45 -0400 (0:00:00.058) 0:00:09.970 ****** 11728 1726882185.11792: entering _queue_task() for managed_node3/stat 11728 1726882185.12041: worker is 1 (out of 1 available) 11728 1726882185.12053: exiting _queue_task() for managed_node3/stat 11728 1726882185.12065: done queuing things up, now waiting for results queue to drain 11728 1726882185.12066: waiting for pending results... 
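The skipped task above ("Ensure ansible_facts used by role are present", set_facts.yml:3) ran the setup module guarded by the conditional printed in its result. A plausible reconstruction follows; gather_subset is an assumption, since the log only records the module name and the when: condition:

    - name: Ensure ansible_facts used by role are present
      ansible.builtin.setup:
        gather_subset: min            # assumed; not visible in the trace
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
      no_log: true                    # consistent with the "censored" skip result above
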
11728 1726882185.12422: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 11728 1726882185.12459: in run() - task 12673a56-9f93-5c28-a762-0000000002d6 11728 1726882185.12480: variable 'ansible_search_path' from source: unknown 11728 1726882185.12504: variable 'ansible_search_path' from source: unknown 11728 1726882185.12543: calling self._execute() 11728 1726882185.12699: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882185.12703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882185.12706: variable 'omit' from source: magic vars 11728 1726882185.13026: variable 'ansible_distribution_major_version' from source: facts 11728 1726882185.13042: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882185.13213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882185.13482: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882185.13535: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882185.13572: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882185.13617: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882185.13736: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882185.13765: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882185.13797: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882185.13898: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882185.13925: variable '__network_is_ostree' from source: set_fact 11728 1726882185.13946: Evaluated conditional (not __network_is_ostree is defined): False 11728 1726882185.13954: when evaluation is False, skipping this task 11728 1726882185.13962: _execute() done 11728 1726882185.13970: dumping result to json 11728 1726882185.13977: done dumping result, returning 11728 1726882185.13988: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-5c28-a762-0000000002d6] 11728 1726882185.14002: sending task result for task 12673a56-9f93-5c28-a762-0000000002d6 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11728 1726882185.14245: no more pending results, returning what we have 11728 1726882185.14250: results queue empty 11728 1726882185.14251: checking for any_errors_fatal 11728 1726882185.14261: done checking for any_errors_fatal 11728 1726882185.14262: checking for max_fail_percentage 11728 1726882185.14264: done checking for max_fail_percentage 11728 1726882185.14265: checking to see if all hosts have 
failed and the running result is not ok 11728 1726882185.14266: done checking to see if all hosts have failed 11728 1726882185.14267: getting the remaining hosts for this loop 11728 1726882185.14269: done getting the remaining hosts for this loop 11728 1726882185.14272: getting the next task for host managed_node3 11728 1726882185.14280: done getting next task for host managed_node3 11728 1726882185.14284: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11728 1726882185.14290: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882185.14306: getting variables 11728 1726882185.14308: in VariableManager get_vars() 11728 1726882185.14344: Calling all_inventory to load vars for managed_node3 11728 1726882185.14347: Calling groups_inventory to load vars for managed_node3 11728 1726882185.14350: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882185.14361: Calling all_plugins_play to load vars for managed_node3 11728 1726882185.14364: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882185.14367: Calling groups_plugins_play to load vars for managed_node3 11728 1726882185.14712: done sending task result for task 12673a56-9f93-5c28-a762-0000000002d6 11728 1726882185.14716: WORKER PROCESS EXITING 11728 1726882185.14745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882185.14965: done with get_vars() 11728 1726882185.14976: done getting variables 11728 1726882185.15035: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:29:45 -0400 (0:00:00.032) 0:00:10.003 ****** 11728 1726882185.15079: entering _queue_task() for managed_node3/set_fact 11728 1726882185.15348: worker is 1 (out of 1 available) 11728 1726882185.15360: exiting _queue_task() for managed_node3/set_fact 11728 1726882185.15486: done queuing things up, now waiting for results queue to drain 11728 1726882185.15487: waiting for pending results... 
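The "Check if system is ostree" task skipped above is a stat call guarded by not __network_is_ostree is defined, and the set_fact now being queued carries the same guard. Because neither task actually ran, their arguments are not logged; the sketch below assumes the conventional ostree marker file and a registered result feeding the fact:

    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted      # assumption: typical marker path; the skipped task's args are not in this log
      register: __ostree_booted_stat  # assumption: name chosen for illustration
      when: not __network_is_ostree is defined

    - name: Set flag to indicate system is ostree
      ansible.builtin.set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"   # assumption: derived from the stat above
      when: not __network_is_ostree is defined
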
11728 1726882185.15645: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11728 1726882185.15788: in run() - task 12673a56-9f93-5c28-a762-0000000002d7 11728 1726882185.15820: variable 'ansible_search_path' from source: unknown 11728 1726882185.15829: variable 'ansible_search_path' from source: unknown 11728 1726882185.15867: calling self._execute() 11728 1726882185.15955: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882185.15966: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882185.15981: variable 'omit' from source: magic vars 11728 1726882185.16421: variable 'ansible_distribution_major_version' from source: facts 11728 1726882185.16436: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882185.16584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882185.16853: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882185.16902: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882185.16998: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882185.17008: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882185.17054: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882185.17082: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882185.17120: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882185.17150: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882185.17239: variable '__network_is_ostree' from source: set_fact 11728 1726882185.17251: Evaluated conditional (not __network_is_ostree is defined): False 11728 1726882185.17258: when evaluation is False, skipping this task 11728 1726882185.17265: _execute() done 11728 1726882185.17271: dumping result to json 11728 1726882185.17279: done dumping result, returning 11728 1726882185.17292: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-5c28-a762-0000000002d7] 11728 1726882185.17306: sending task result for task 12673a56-9f93-5c28-a762-0000000002d7 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11728 1726882185.17662: no more pending results, returning what we have 11728 1726882185.17666: results queue empty 11728 1726882185.17667: checking for any_errors_fatal 11728 1726882185.17670: done checking for any_errors_fatal 11728 1726882185.17671: checking for max_fail_percentage 11728 1726882185.17672: done checking for max_fail_percentage 11728 1726882185.17673: checking to see 
if all hosts have failed and the running result is not ok 11728 1726882185.17674: done checking to see if all hosts have failed 11728 1726882185.17675: getting the remaining hosts for this loop 11728 1726882185.17676: done getting the remaining hosts for this loop 11728 1726882185.17679: getting the next task for host managed_node3 11728 1726882185.17687: done getting next task for host managed_node3 11728 1726882185.17690: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 11728 1726882185.17698: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882185.17710: getting variables 11728 1726882185.17711: in VariableManager get_vars() 11728 1726882185.17746: Calling all_inventory to load vars for managed_node3 11728 1726882185.17749: Calling groups_inventory to load vars for managed_node3 11728 1726882185.17751: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882185.17757: done sending task result for task 12673a56-9f93-5c28-a762-0000000002d7 11728 1726882185.17760: WORKER PROCESS EXITING 11728 1726882185.17767: Calling all_plugins_play to load vars for managed_node3 11728 1726882185.17769: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882185.17772: Calling groups_plugins_play to load vars for managed_node3 11728 1726882185.17986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882185.18211: done with get_vars() 11728 1726882185.18222: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:29:45 -0400 (0:00:00.032) 0:00:10.035 ****** 11728 1726882185.18319: entering _queue_task() for managed_node3/service_facts 11728 1726882185.18320: Creating lock for service_facts 11728 1726882185.18555: worker is 1 (out of 1 available) 11728 1726882185.18567: exiting _queue_task() for managed_node3/service_facts 11728 1726882185.18577: done queuing things up, now waiting for results queue to drain 11728 1726882185.18578: waiting for pending results... 11728 1726882185.18845: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 11728 1726882185.18999: in run() - task 12673a56-9f93-5c28-a762-0000000002d9 11728 1726882185.19002: variable 'ansible_search_path' from source: unknown 11728 1726882185.19004: variable 'ansible_search_path' from source: unknown 11728 1726882185.19049: calling self._execute() 11728 1726882185.19125: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882185.19129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882185.19157: variable 'omit' from source: magic vars 11728 1726882185.19503: variable 'ansible_distribution_major_version' from source: facts 11728 1726882185.19558: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882185.19561: variable 'omit' from source: magic vars 11728 1726882185.19613: variable 'omit' from source: magic vars 11728 1726882185.19648: variable 'omit' from source: magic vars 11728 1726882185.19691: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882185.19735: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882185.19784: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882185.19787: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882185.19798: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882185.19834: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882185.19842: variable 'ansible_host' from source: host vars for 
'managed_node3' 11728 1726882185.19849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882185.20027: Set connection var ansible_connection to ssh 11728 1726882185.20030: Set connection var ansible_shell_executable to /bin/sh 11728 1726882185.20032: Set connection var ansible_timeout to 10 11728 1726882185.20034: Set connection var ansible_shell_type to sh 11728 1726882185.20036: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882185.20038: Set connection var ansible_pipelining to False 11728 1726882185.20040: variable 'ansible_shell_executable' from source: unknown 11728 1726882185.20042: variable 'ansible_connection' from source: unknown 11728 1726882185.20045: variable 'ansible_module_compression' from source: unknown 11728 1726882185.20047: variable 'ansible_shell_type' from source: unknown 11728 1726882185.20048: variable 'ansible_shell_executable' from source: unknown 11728 1726882185.20050: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882185.20052: variable 'ansible_pipelining' from source: unknown 11728 1726882185.20059: variable 'ansible_timeout' from source: unknown 11728 1726882185.20068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882185.20254: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11728 1726882185.20354: variable 'omit' from source: magic vars 11728 1726882185.20358: starting attempt loop 11728 1726882185.20360: running the handler 11728 1726882185.20363: _low_level_execute_command(): starting 11728 1726882185.20365: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882185.21115: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882185.21149: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882185.21163: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882185.21183: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882185.21267: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882185.22944: stdout chunk (state=3): >>>/root <<< 11728 1726882185.23079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882185.23100: stdout chunk (state=3): >>><<< 11728 1726882185.23136: stderr chunk (state=3): 
>>><<< 11728 1726882185.23159: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882185.23185: _low_level_execute_command(): starting 11728 1726882185.23268: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882185.2316585-12256-256195962772997 `" && echo ansible-tmp-1726882185.2316585-12256-256195962772997="` echo /root/.ansible/tmp/ansible-tmp-1726882185.2316585-12256-256195962772997 `" ) && sleep 0' 11728 1726882185.23982: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882185.24028: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882185.24032: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882185.26003: stdout chunk (state=3): >>>ansible-tmp-1726882185.2316585-12256-256195962772997=/root/.ansible/tmp/ansible-tmp-1726882185.2316585-12256-256195962772997 <<< 11728 1726882185.26172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882185.26176: stdout chunk (state=3): >>><<< 11728 1726882185.26178: stderr chunk (state=3): >>><<< 11728 1726882185.26180: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882185.2316585-12256-256195962772997=/root/.ansible/tmp/ansible-tmp-1726882185.2316585-12256-256195962772997 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882185.26183: variable 'ansible_module_compression' from source: unknown 11728 1726882185.26326: ANSIBALLZ: Using lock for service_facts 11728 1726882185.26506: ANSIBALLZ: Acquiring lock 11728 1726882185.26509: ANSIBALLZ: Lock acquired: 139840767829504 11728 1726882185.26511: ANSIBALLZ: Creating module 11728 1726882185.42054: ANSIBALLZ: Writing module into payload 11728 1726882185.42169: ANSIBALLZ: Writing module 11728 1726882185.42208: ANSIBALLZ: Renaming module 11728 1726882185.42224: ANSIBALLZ: Done creating module 11728 1726882185.42246: variable 'ansible_facts' from source: unknown 11728 1726882185.42341: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882185.2316585-12256-256195962772997/AnsiballZ_service_facts.py 11728 1726882185.42560: Sending initial data 11728 1726882185.42564: Sent initial data (162 bytes) 11728 1726882185.43424: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882185.43476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882185.43522: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882185.43536: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882185.43590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 11728 1726882185.45124: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882185.45163: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882185.45212: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpzkhgd3l1 /root/.ansible/tmp/ansible-tmp-1726882185.2316585-12256-256195962772997/AnsiballZ_service_facts.py <<< 11728 1726882185.45217: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882185.2316585-12256-256195962772997/AnsiballZ_service_facts.py" <<< 11728 1726882185.45257: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpzkhgd3l1" to remote "/root/.ansible/tmp/ansible-tmp-1726882185.2316585-12256-256195962772997/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882185.2316585-12256-256195962772997/AnsiballZ_service_facts.py" <<< 11728 1726882185.45974: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882185.46085: stderr chunk (state=3): >>><<< 11728 1726882185.46088: stdout chunk (state=3): >>><<< 11728 1726882185.46090: done transferring module to remote 11728 1726882185.46096: _low_level_execute_command(): starting 11728 1726882185.46099: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882185.2316585-12256-256195962772997/ /root/.ansible/tmp/ansible-tmp-1726882185.2316585-12256-256195962772997/AnsiballZ_service_facts.py && sleep 0' 11728 1726882185.46645: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882185.46654: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882185.46665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882185.46681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882185.46696: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882185.46747: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882185.46803: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882185.46817: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882185.46836: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882185.46915: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882185.48711: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882185.48715: stdout chunk (state=3): >>><<< 11728 1726882185.48717: stderr chunk (state=3): >>><<< 11728 1726882185.48808: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882185.48839: _low_level_execute_command(): starting 11728 1726882185.48842: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882185.2316585-12256-256195962772997/AnsiballZ_service_facts.py && sleep 0' 11728 1726882185.49532: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882185.49549: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882185.49591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882185.49606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882185.49698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882185.49741: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882185.49806: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882186.99876: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": 
"hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-ma<<< 11728 1726882186.99947: stdout chunk (state=3): >>>rk.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": 
"systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": 
"chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive",<<< 11728 1726882186.99955: stdout chunk (state=3): >>> "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 11728 1726882187.01599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 11728 1726882187.01603: stdout chunk (state=3): >>><<< 11728 1726882187.01605: stderr chunk (state=3): >>><<< 11728 1726882187.01612: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": 
"systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": 
"chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": 
"modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
11728 1726882187.02144: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882185.2316585-12256-256195962772997/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882187.02154: _low_level_execute_command(): starting 11728 1726882187.02160: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882185.2316585-12256-256195962772997/ > /dev/null 2>&1 && sleep 0' 11728 1726882187.02798: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882187.02815: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882187.02826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882187.02840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882187.02850: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882187.02857: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882187.02866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882187.02879: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882187.02885: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882187.02891: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11728 1726882187.02904: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882187.02919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882187.02994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882187.03013: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882187.03049: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882187.03091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882187.04900: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882187.04904: stdout chunk (state=3): >>><<< 11728 1726882187.04906: stderr chunk (state=3): >>><<< 11728 1726882187.04922: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882187.04934: handler run complete 11728 1726882187.05300: variable 'ansible_facts' from source: unknown 11728 1726882187.05304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882187.05809: variable 'ansible_facts' from source: unknown 11728 1726882187.07188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882187.07441: attempt loop complete, returning result 11728 1726882187.07451: _execute() done 11728 1726882187.07458: dumping result to json 11728 1726882187.07534: done dumping result, returning 11728 1726882187.07549: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-5c28-a762-0000000002d9] 11728 1726882187.07558: sending task result for task 12673a56-9f93-5c28-a762-0000000002d9 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11728 1726882187.08652: no more pending results, returning what we have 11728 1726882187.08655: results queue empty 11728 1726882187.08656: checking for any_errors_fatal 11728 1726882187.08659: done checking for any_errors_fatal 11728 1726882187.08660: checking for max_fail_percentage 11728 1726882187.08662: done checking for max_fail_percentage 11728 1726882187.08663: checking to see if all hosts have failed and the running result is not ok 11728 1726882187.08663: done checking to see if all hosts have failed 11728 1726882187.08664: getting the remaining hosts for this loop 11728 1726882187.08665: done getting the remaining hosts for this loop 11728 1726882187.08669: getting the next task for host managed_node3 11728 1726882187.08675: done getting next task for host managed_node3 11728 1726882187.08678: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 11728 1726882187.08684: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882187.08704: getting variables 11728 1726882187.08706: in VariableManager get_vars() 11728 1726882187.08734: Calling all_inventory to load vars for managed_node3 11728 1726882187.08737: Calling groups_inventory to load vars for managed_node3 11728 1726882187.08812: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882187.08822: Calling all_plugins_play to load vars for managed_node3 11728 1726882187.08825: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882187.08828: Calling groups_plugins_play to load vars for managed_node3 11728 1726882187.09261: done sending task result for task 12673a56-9f93-5c28-a762-0000000002d9 11728 1726882187.09264: WORKER PROCESS EXITING 11728 1726882187.09287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882187.09834: done with get_vars() 11728 1726882187.09847: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:29:47 -0400 (0:00:01.916) 0:00:11.951 ****** 11728 1726882187.09954: entering _queue_task() for managed_node3/package_facts 11728 1726882187.09956: Creating lock for package_facts 11728 1726882187.10276: worker is 1 (out of 1 available) 11728 1726882187.10291: exiting _queue_task() for managed_node3/package_facts 11728 1726882187.10308: done queuing things up, now waiting for results queue to drain 11728 1726882187.10310: waiting for pending results... 
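At this point the executor has queued the role's "Check which packages are installed" task (set_facts.yml:26), which drives the package_facts module whose output appears further below. The actual task body is not part of this trace, so the following is only an illustrative sketch of what such a step typically looks like and of the shape of the data it returns:

    - name: Check which packages are installed       # illustrative; the real task lives in set_facts.yml:26
      ansible.builtin.package_facts:

    # Afterwards, ansible_facts.packages maps each package name to a list of
    # installed instances, e.g. ansible_facts.packages['glibc'][0].version and
    # ['glibc'][0].release, matching the JSON the module emits later in this log.

Entries are lists because the same package name can be installed for more than one version or architecture at once.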
11728 1726882187.10602: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 11728 1726882187.10753: in run() - task 12673a56-9f93-5c28-a762-0000000002da 11728 1726882187.10798: variable 'ansible_search_path' from source: unknown 11728 1726882187.10808: variable 'ansible_search_path' from source: unknown 11728 1726882187.10847: calling self._execute() 11728 1726882187.10941: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882187.10953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882187.11005: variable 'omit' from source: magic vars 11728 1726882187.11289: variable 'ansible_distribution_major_version' from source: facts 11728 1726882187.11302: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882187.11307: variable 'omit' from source: magic vars 11728 1726882187.11359: variable 'omit' from source: magic vars 11728 1726882187.11380: variable 'omit' from source: magic vars 11728 1726882187.11412: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882187.11441: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882187.11456: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882187.11469: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882187.11479: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882187.11506: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882187.11509: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882187.11512: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882187.11577: Set connection var ansible_connection to ssh 11728 1726882187.11585: Set connection var ansible_shell_executable to /bin/sh 11728 1726882187.11590: Set connection var ansible_timeout to 10 11728 1726882187.11594: Set connection var ansible_shell_type to sh 11728 1726882187.11604: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882187.11608: Set connection var ansible_pipelining to False 11728 1726882187.11626: variable 'ansible_shell_executable' from source: unknown 11728 1726882187.11630: variable 'ansible_connection' from source: unknown 11728 1726882187.11633: variable 'ansible_module_compression' from source: unknown 11728 1726882187.11635: variable 'ansible_shell_type' from source: unknown 11728 1726882187.11639: variable 'ansible_shell_executable' from source: unknown 11728 1726882187.11641: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882187.11644: variable 'ansible_pipelining' from source: unknown 11728 1726882187.11647: variable 'ansible_timeout' from source: unknown 11728 1726882187.11649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882187.11786: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11728 1726882187.11795: variable 'omit' from source: magic vars 11728 
1726882187.11802: starting attempt loop 11728 1726882187.11805: running the handler 11728 1726882187.11816: _low_level_execute_command(): starting 11728 1726882187.11824: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882187.12281: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882187.12325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882187.12329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882187.12332: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882187.12335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882187.12376: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882187.12379: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882187.12384: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882187.12439: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882187.14079: stdout chunk (state=3): >>>/root <<< 11728 1726882187.14178: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882187.14203: stderr chunk (state=3): >>><<< 11728 1726882187.14207: stdout chunk (state=3): >>><<< 11728 1726882187.14224: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882187.14234: _low_level_execute_command(): starting 11728 1726882187.14239: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir 
-p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882187.1422293-12318-48157721734778 `" && echo ansible-tmp-1726882187.1422293-12318-48157721734778="` echo /root/.ansible/tmp/ansible-tmp-1726882187.1422293-12318-48157721734778 `" ) && sleep 0' 11728 1726882187.14648: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882187.14651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882187.14659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11728 1726882187.14662: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882187.14665: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882187.14712: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882187.14719: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882187.14759: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882187.16600: stdout chunk (state=3): >>>ansible-tmp-1726882187.1422293-12318-48157721734778=/root/.ansible/tmp/ansible-tmp-1726882187.1422293-12318-48157721734778 <<< 11728 1726882187.16731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882187.16734: stdout chunk (state=3): >>><<< 11728 1726882187.16740: stderr chunk (state=3): >>><<< 11728 1726882187.16751: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882187.1422293-12318-48157721734778=/root/.ansible/tmp/ansible-tmp-1726882187.1422293-12318-48157721734778 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 11728 1726882187.16789: variable 'ansible_module_compression' from source: unknown 11728 1726882187.16828: ANSIBALLZ: Using lock for package_facts 11728 1726882187.16832: ANSIBALLZ: Acquiring lock 11728 1726882187.16834: ANSIBALLZ: Lock acquired: 139840765874720 11728 1726882187.16837: ANSIBALLZ: Creating module 11728 1726882187.38669: ANSIBALLZ: Writing module into payload 11728 1726882187.38901: ANSIBALLZ: Writing module 11728 1726882187.38904: ANSIBALLZ: Renaming module 11728 1726882187.38907: ANSIBALLZ: Done creating module 11728 1726882187.38909: variable 'ansible_facts' from source: unknown 11728 1726882187.39084: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882187.1422293-12318-48157721734778/AnsiballZ_package_facts.py 11728 1726882187.39330: Sending initial data 11728 1726882187.39333: Sent initial data (161 bytes) 11728 1726882187.39888: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882187.39907: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882187.39921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882187.39966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882187.40043: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882187.40059: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882187.40086: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882187.40178: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882187.41836: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882187.41894: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882187.41956: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpvijedutu /root/.ansible/tmp/ansible-tmp-1726882187.1422293-12318-48157721734778/AnsiballZ_package_facts.py <<< 11728 1726882187.41959: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882187.1422293-12318-48157721734778/AnsiballZ_package_facts.py" <<< 11728 1726882187.42006: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpvijedutu" to remote "/root/.ansible/tmp/ansible-tmp-1726882187.1422293-12318-48157721734778/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882187.1422293-12318-48157721734778/AnsiballZ_package_facts.py" <<< 11728 1726882187.44065: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882187.44069: stdout chunk (state=3): >>><<< 11728 1726882187.44071: stderr chunk (state=3): >>><<< 11728 1726882187.44073: done transferring module to remote 11728 1726882187.44075: _low_level_execute_command(): starting 11728 1726882187.44077: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882187.1422293-12318-48157721734778/ /root/.ansible/tmp/ansible-tmp-1726882187.1422293-12318-48157721734778/AnsiballZ_package_facts.py && sleep 0' 11728 1726882187.44666: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882187.44689: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882187.44715: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882187.44777: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882187.46603: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882187.46615: stdout chunk (state=3): >>><<< 11728 1726882187.46627: stderr chunk (state=3): >>><<< 11728 1726882187.46647: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882187.46656: _low_level_execute_command(): starting 11728 1726882187.46666: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882187.1422293-12318-48157721734778/AnsiballZ_package_facts.py && sleep 0' 11728 1726882187.47266: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882187.47279: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882187.47297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882187.47317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882187.47415: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882187.47431: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882187.47447: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882187.47469: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882187.47552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882187.91304: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", 
"release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 11728 1726882187.91396: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": 
"0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": 
"0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 11728 1726882187.91471: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": 
[{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 11728 1726882187.91482: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": 
"0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": 
[{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 11728 1726882187.91567: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", 
"release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 11728 1726882187.93278: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882187.93306: stderr chunk (state=3): >>><<< 11728 1726882187.93310: stdout chunk (state=3): >>><<< 11728 1726882187.93342: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": 
"0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", 
"version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", 
"version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": 
"1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": 
"x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", 
"version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", 
"version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
11728 1726882187.95082: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882187.1422293-12318-48157721734778/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882187.95086: _low_level_execute_command(): starting 11728 1726882187.95299: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882187.1422293-12318-48157721734778/ > /dev/null 2>&1 && sleep 0' 11728 1726882187.95603: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882187.95618: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882187.95637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882187.95656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882187.95673: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882187.95716: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 11728 1726882187.95734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882187.95809: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882187.95846: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882187.95861: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882187.95925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882187.97850: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882187.97866: stdout chunk (state=3): >>><<< 11728 1726882187.97879: stderr chunk (state=3): >>><<< 11728 1726882187.97907: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882187.97920: handler run complete 11728 1726882187.99862: variable 'ansible_facts' from source: unknown 11728 1726882188.00271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882188.02297: variable 'ansible_facts' from source: unknown 11728 1726882188.02749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882188.03484: attempt loop complete, returning result 11728 1726882188.03496: _execute() done 11728 1726882188.03500: dumping result to json 11728 1726882188.03710: done dumping result, returning 11728 1726882188.03721: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-5c28-a762-0000000002da] 11728 1726882188.03724: sending task result for task 12673a56-9f93-5c28-a762-0000000002da 11728 1726882188.10301: done sending task result for task 12673a56-9f93-5c28-a762-0000000002da 11728 1726882188.10305: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11728 1726882188.10402: no more pending results, returning what we have 11728 1726882188.10405: results queue empty 11728 1726882188.10406: checking for any_errors_fatal 11728 1726882188.10409: done checking for any_errors_fatal 11728 1726882188.10410: checking for max_fail_percentage 11728 1726882188.10411: done checking for max_fail_percentage 11728 1726882188.10412: checking to see if all hosts have failed and the running result is not ok 11728 1726882188.10413: done checking to see if all hosts have failed 11728 1726882188.10413: getting the remaining hosts for this loop 11728 1726882188.10414: done getting the remaining hosts for this loop 11728 1726882188.10418: getting the next task for host managed_node3 11728 1726882188.10425: done getting next task for host managed_node3 11728 1726882188.10428: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 11728 1726882188.10433: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882188.10442: getting variables 11728 1726882188.10443: in VariableManager get_vars() 11728 1726882188.10467: Calling all_inventory to load vars for managed_node3 11728 1726882188.10470: Calling groups_inventory to load vars for managed_node3 11728 1726882188.10477: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882188.10485: Calling all_plugins_play to load vars for managed_node3 11728 1726882188.10488: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882188.10490: Calling groups_plugins_play to load vars for managed_node3 11728 1726882188.11680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882188.13310: done with get_vars() 11728 1726882188.13335: done getting variables 11728 1726882188.13399: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:29:48 -0400 (0:00:01.034) 0:00:12.986 ****** 11728 1726882188.13436: entering _queue_task() for managed_node3/debug 11728 1726882188.13921: worker is 1 (out of 1 available) 11728 1726882188.13930: exiting _queue_task() for managed_node3/debug 11728 1726882188.13941: done queuing things up, now waiting for results queue to drain 11728 1726882188.13943: waiting for pending results... 
11728 1726882188.14074: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 11728 1726882188.14281: in run() - task 12673a56-9f93-5c28-a762-000000000278 11728 1726882188.14285: variable 'ansible_search_path' from source: unknown 11728 1726882188.14287: variable 'ansible_search_path' from source: unknown 11728 1726882188.14290: calling self._execute() 11728 1726882188.14364: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882188.14377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882188.14402: variable 'omit' from source: magic vars 11728 1726882188.14775: variable 'ansible_distribution_major_version' from source: facts 11728 1726882188.14789: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882188.14802: variable 'omit' from source: magic vars 11728 1726882188.14863: variable 'omit' from source: magic vars 11728 1726882188.14954: variable 'network_provider' from source: set_fact 11728 1726882188.15044: variable 'omit' from source: magic vars 11728 1726882188.15048: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882188.15073: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882188.15101: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882188.15125: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882188.15143: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882188.15188: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882188.15199: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882188.15208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882188.15322: Set connection var ansible_connection to ssh 11728 1726882188.15339: Set connection var ansible_shell_executable to /bin/sh 11728 1726882188.15349: Set connection var ansible_timeout to 10 11728 1726882188.15371: Set connection var ansible_shell_type to sh 11728 1726882188.15374: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882188.15383: Set connection var ansible_pipelining to False 11728 1726882188.15482: variable 'ansible_shell_executable' from source: unknown 11728 1726882188.15486: variable 'ansible_connection' from source: unknown 11728 1726882188.15488: variable 'ansible_module_compression' from source: unknown 11728 1726882188.15491: variable 'ansible_shell_type' from source: unknown 11728 1726882188.15495: variable 'ansible_shell_executable' from source: unknown 11728 1726882188.15497: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882188.15499: variable 'ansible_pipelining' from source: unknown 11728 1726882188.15501: variable 'ansible_timeout' from source: unknown 11728 1726882188.15503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882188.15621: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 11728 1726882188.15638: variable 'omit' from source: magic vars 11728 1726882188.15648: starting attempt loop 11728 1726882188.15654: running the handler 11728 1726882188.15710: handler run complete 11728 1726882188.15729: attempt loop complete, returning result 11728 1726882188.15737: _execute() done 11728 1726882188.15744: dumping result to json 11728 1726882188.15798: done dumping result, returning 11728 1726882188.15808: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-5c28-a762-000000000278] 11728 1726882188.15812: sending task result for task 12673a56-9f93-5c28-a762-000000000278 11728 1726882188.15886: done sending task result for task 12673a56-9f93-5c28-a762-000000000278 ok: [managed_node3] => {} MSG: Using network provider: nm 11728 1726882188.15977: no more pending results, returning what we have 11728 1726882188.15981: results queue empty 11728 1726882188.15982: checking for any_errors_fatal 11728 1726882188.15995: done checking for any_errors_fatal 11728 1726882188.15996: checking for max_fail_percentage 11728 1726882188.15998: done checking for max_fail_percentage 11728 1726882188.15999: checking to see if all hosts have failed and the running result is not ok 11728 1726882188.16000: done checking to see if all hosts have failed 11728 1726882188.16001: getting the remaining hosts for this loop 11728 1726882188.16002: done getting the remaining hosts for this loop 11728 1726882188.16007: getting the next task for host managed_node3 11728 1726882188.16016: done getting next task for host managed_node3 11728 1726882188.16134: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11728 1726882188.16141: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882188.16153: getting variables 11728 1726882188.16154: in VariableManager get_vars() 11728 1726882188.16192: Calling all_inventory to load vars for managed_node3 11728 1726882188.16246: Calling groups_inventory to load vars for managed_node3 11728 1726882188.16249: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882188.16255: WORKER PROCESS EXITING 11728 1726882188.16265: Calling all_plugins_play to load vars for managed_node3 11728 1726882188.16269: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882188.16272: Calling groups_plugins_play to load vars for managed_node3 11728 1726882188.17807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882188.18751: done with get_vars() 11728 1726882188.18767: done getting variables 11728 1726882188.18835: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:29:48 -0400 (0:00:00.054) 0:00:13.041 ****** 11728 1726882188.18864: entering _queue_task() for managed_node3/fail 11728 1726882188.18865: Creating lock for fail 11728 1726882188.19078: worker is 1 (out of 1 available) 11728 1726882188.19092: exiting _queue_task() for managed_node3/fail 11728 1726882188.19106: done queuing things up, now waiting for results queue to drain 11728 1726882188.19107: waiting for pending results... 
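The "Using network provider: nm" message reported by the preceding task comes from a plain debug task at roles/network/tasks/main.yml:7 in the collection; a sketch of what that task likely looks like, with the msg template reconstructed from the logged output rather than copied from the role:

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"

network_provider itself was registered earlier with set_fact, which is why the trace lists it as coming "from source: set_fact".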
11728 1726882188.19278: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11728 1726882188.19368: in run() - task 12673a56-9f93-5c28-a762-000000000279 11728 1726882188.19379: variable 'ansible_search_path' from source: unknown 11728 1726882188.19382: variable 'ansible_search_path' from source: unknown 11728 1726882188.19422: calling self._execute() 11728 1726882188.19483: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882188.19487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882188.19496: variable 'omit' from source: magic vars 11728 1726882188.19909: variable 'ansible_distribution_major_version' from source: facts 11728 1726882188.19913: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882188.20102: variable 'network_state' from source: role '' defaults 11728 1726882188.20106: Evaluated conditional (network_state != {}): False 11728 1726882188.20108: when evaluation is False, skipping this task 11728 1726882188.20111: _execute() done 11728 1726882188.20113: dumping result to json 11728 1726882188.20115: done dumping result, returning 11728 1726882188.20117: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-5c28-a762-000000000279] 11728 1726882188.20119: sending task result for task 12673a56-9f93-5c28-a762-000000000279 11728 1726882188.20183: done sending task result for task 12673a56-9f93-5c28-a762-000000000279 11728 1726882188.20186: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11728 1726882188.20235: no more pending results, returning what we have 11728 1726882188.20239: results queue empty 11728 1726882188.20240: checking for any_errors_fatal 11728 1726882188.20246: done checking for any_errors_fatal 11728 1726882188.20247: checking for max_fail_percentage 11728 1726882188.20248: done checking for max_fail_percentage 11728 1726882188.20249: checking to see if all hosts have failed and the running result is not ok 11728 1726882188.20250: done checking to see if all hosts have failed 11728 1726882188.20251: getting the remaining hosts for this loop 11728 1726882188.20252: done getting the remaining hosts for this loop 11728 1726882188.20256: getting the next task for host managed_node3 11728 1726882188.20263: done getting next task for host managed_node3 11728 1726882188.20267: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11728 1726882188.20272: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882188.20287: getting variables 11728 1726882188.20288: in VariableManager get_vars() 11728 1726882188.20325: Calling all_inventory to load vars for managed_node3 11728 1726882188.20327: Calling groups_inventory to load vars for managed_node3 11728 1726882188.20330: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882188.20341: Calling all_plugins_play to load vars for managed_node3 11728 1726882188.20344: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882188.20347: Calling groups_plugins_play to load vars for managed_node3 11728 1726882188.21515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882188.22361: done with get_vars() 11728 1726882188.22375: done getting variables 11728 1726882188.22417: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:29:48 -0400 (0:00:00.035) 0:00:13.076 ****** 11728 1726882188.22441: entering _queue_task() for managed_node3/fail 11728 1726882188.22632: worker is 1 (out of 1 available) 11728 1726882188.22646: exiting _queue_task() for managed_node3/fail 11728 1726882188.22659: done queuing things up, now waiting for results queue to drain 11728 1726882188.22660: waiting for pending results... 
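Both "Abort applying the network state configuration ..." tasks in this stretch are skipped for the same reason: network_state comes from the role defaults as an empty dict, so the guard condition evaluates to False and Ansible logs "when evaluation is False, skipping this task". A minimal sketch of the pattern, assuming a fail task guarded by that condition (the role's real task also sets an explanatory msg, which is not visible in this log and is omitted here):

    - name: Abort when network_state is used with the initscripts provider
      ansible.builtin.fail:
      when: network_state != {}

With network_state left at its default of {}, the condition is False for both task 12673a56-9f93-5c28-a762-000000000279 and 12673a56-9f93-5c28-a762-00000000027a, which matches the two skipping results recorded here.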
11728 1726882188.22853: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11728 1726882188.23201: in run() - task 12673a56-9f93-5c28-a762-00000000027a 11728 1726882188.23205: variable 'ansible_search_path' from source: unknown 11728 1726882188.23212: variable 'ansible_search_path' from source: unknown 11728 1726882188.23215: calling self._execute() 11728 1726882188.23218: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882188.23222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882188.23224: variable 'omit' from source: magic vars 11728 1726882188.23520: variable 'ansible_distribution_major_version' from source: facts 11728 1726882188.23541: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882188.23664: variable 'network_state' from source: role '' defaults 11728 1726882188.23674: Evaluated conditional (network_state != {}): False 11728 1726882188.23677: when evaluation is False, skipping this task 11728 1726882188.23679: _execute() done 11728 1726882188.23682: dumping result to json 11728 1726882188.23685: done dumping result, returning 11728 1726882188.23697: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-5c28-a762-00000000027a] 11728 1726882188.23700: sending task result for task 12673a56-9f93-5c28-a762-00000000027a 11728 1726882188.23789: done sending task result for task 12673a56-9f93-5c28-a762-00000000027a 11728 1726882188.23800: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11728 1726882188.23843: no more pending results, returning what we have 11728 1726882188.23847: results queue empty 11728 1726882188.23848: checking for any_errors_fatal 11728 1726882188.23854: done checking for any_errors_fatal 11728 1726882188.23855: checking for max_fail_percentage 11728 1726882188.23856: done checking for max_fail_percentage 11728 1726882188.23857: checking to see if all hosts have failed and the running result is not ok 11728 1726882188.23858: done checking to see if all hosts have failed 11728 1726882188.23858: getting the remaining hosts for this loop 11728 1726882188.23860: done getting the remaining hosts for this loop 11728 1726882188.23864: getting the next task for host managed_node3 11728 1726882188.23870: done getting next task for host managed_node3 11728 1726882188.23873: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11728 1726882188.23878: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882188.23890: getting variables 11728 1726882188.23891: in VariableManager get_vars() 11728 1726882188.23923: Calling all_inventory to load vars for managed_node3 11728 1726882188.23925: Calling groups_inventory to load vars for managed_node3 11728 1726882188.23927: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882188.23935: Calling all_plugins_play to load vars for managed_node3 11728 1726882188.23937: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882188.23939: Calling groups_plugins_play to load vars for managed_node3 11728 1726882188.24943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882188.25788: done with get_vars() 11728 1726882188.25806: done getting variables 11728 1726882188.25849: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:29:48 -0400 (0:00:00.034) 0:00:13.111 ****** 11728 1726882188.25873: entering _queue_task() for managed_node3/fail 11728 1726882188.26122: worker is 1 (out of 1 available) 11728 1726882188.26135: exiting _queue_task() for managed_node3/fail 11728 1726882188.26147: done queuing things up, now waiting for results queue to drain 11728 1726882188.26148: waiting for pending results... 
11728 1726882188.26392: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11728 1726882188.26602: in run() - task 12673a56-9f93-5c28-a762-00000000027b 11728 1726882188.26609: variable 'ansible_search_path' from source: unknown 11728 1726882188.26613: variable 'ansible_search_path' from source: unknown 11728 1726882188.26616: calling self._execute() 11728 1726882188.26684: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882188.26698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882188.26732: variable 'omit' from source: magic vars 11728 1726882188.27099: variable 'ansible_distribution_major_version' from source: facts 11728 1726882188.27200: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882188.27283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882188.28988: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882188.29045: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882188.29073: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882188.29106: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882188.29124: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882188.29180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882188.29202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882188.29224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882188.29251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882188.29262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882188.29332: variable 'ansible_distribution_major_version' from source: facts 11728 1726882188.29343: Evaluated conditional (ansible_distribution_major_version | int > 9): True 11728 1726882188.29420: variable 'ansible_distribution' from source: facts 11728 1726882188.29424: variable '__network_rh_distros' from source: role '' defaults 11728 1726882188.29433: Evaluated conditional (ansible_distribution in __network_rh_distros): True 11728 1726882188.29585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882188.29606: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882188.29623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882188.29649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882188.29662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882188.29697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882188.29712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882188.29729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882188.29753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882188.29774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882188.29998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882188.30002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882188.30004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882188.30006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882188.30008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882188.30240: variable 'network_connections' from source: include params 11728 1726882188.30258: variable 'controller_profile' from source: play vars 11728 1726882188.30325: variable 'controller_profile' from source: play vars 11728 1726882188.30340: variable 'controller_device' from source: play vars 11728 1726882188.30402: variable 'controller_device' from source: play vars 11728 1726882188.30420: variable 'port1_profile' from 
source: play vars 11728 1726882188.30481: variable 'port1_profile' from source: play vars 11728 1726882188.30495: variable 'dhcp_interface1' from source: play vars 11728 1726882188.30583: variable 'dhcp_interface1' from source: play vars 11728 1726882188.30609: variable 'controller_profile' from source: play vars 11728 1726882188.30700: variable 'controller_profile' from source: play vars 11728 1726882188.30713: variable 'port2_profile' from source: play vars 11728 1726882188.30789: variable 'port2_profile' from source: play vars 11728 1726882188.30801: variable 'dhcp_interface2' from source: play vars 11728 1726882188.30861: variable 'dhcp_interface2' from source: play vars 11728 1726882188.30867: variable 'controller_profile' from source: play vars 11728 1726882188.30922: variable 'controller_profile' from source: play vars 11728 1726882188.30930: variable 'network_state' from source: role '' defaults 11728 1726882188.30973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882188.31104: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882188.31130: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882188.31153: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882188.31175: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882188.31225: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882188.31241: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882188.31259: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882188.31276: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882188.31313: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 11728 1726882188.31316: when evaluation is False, skipping this task 11728 1726882188.31319: _execute() done 11728 1726882188.31321: dumping result to json 11728 1726882188.31323: done dumping result, returning 11728 1726882188.31328: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-5c28-a762-00000000027b] 11728 1726882188.31332: sending task result for task 12673a56-9f93-5c28-a762-00000000027b 11728 1726882188.31414: done sending task result for task 12673a56-9f93-5c28-a762-00000000027b 11728 1726882188.31417: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", 
\"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 11728 1726882188.31474: no more pending results, returning what we have 11728 1726882188.31478: results queue empty 11728 1726882188.31479: checking for any_errors_fatal 11728 1726882188.31485: done checking for any_errors_fatal 11728 1726882188.31486: checking for max_fail_percentage 11728 1726882188.31488: done checking for max_fail_percentage 11728 1726882188.31489: checking to see if all hosts have failed and the running result is not ok 11728 1726882188.31489: done checking to see if all hosts have failed 11728 1726882188.31490: getting the remaining hosts for this loop 11728 1726882188.31492: done getting the remaining hosts for this loop 11728 1726882188.31497: getting the next task for host managed_node3 11728 1726882188.31504: done getting next task for host managed_node3 11728 1726882188.31507: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11728 1726882188.31512: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882188.31527: getting variables 11728 1726882188.31529: in VariableManager get_vars() 11728 1726882188.31562: Calling all_inventory to load vars for managed_node3 11728 1726882188.31565: Calling groups_inventory to load vars for managed_node3 11728 1726882188.31567: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882188.31575: Calling all_plugins_play to load vars for managed_node3 11728 1726882188.31577: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882188.31579: Calling groups_plugins_play to load vars for managed_node3 11728 1726882188.32337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882188.33269: done with get_vars() 11728 1726882188.33285: done getting variables 11728 1726882188.33356: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:29:48 -0400 (0:00:00.075) 0:00:13.186 ****** 11728 1726882188.33379: entering _queue_task() for managed_node3/dnf 11728 1726882188.33587: worker is 1 (out of 1 available) 11728 1726882188.33604: exiting _queue_task() for managed_node3/dnf 11728 1726882188.33617: done queuing things up, now waiting for results queue to drain 11728 1726882188.33618: waiting for pending results... 
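The teaming abort task above is skipped because none of the profiles in network_connections (and nothing in network_state) declares type: team. The false_condition string in that result corresponds to a when clause roughly like the sketch below; the three conditions are taken from the evaluations logged above, while the task layout and msg wording are assumptions:

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Team interfaces are not supported on EL10 or later  # wording assumed, not taken from the role
      when:
        - ansible_distribution_major_version | int > 9
        - ansible_distribution in __network_rh_distros
        - >-
          network_connections | selectattr("type", "defined") |
          selectattr("type", "match", "^team$") | list | length > 0
          or network_state.get("interfaces", []) | selectattr("type", "defined") |
          selectattr("type", "match", "^team$") | list | length > 0

The selectattr("type", "defined") step drops profiles that do not set a type at all, the second selectattr keeps only type: team entries, and list | length > 0 turns the match into a boolean; since none of the controller or port profiles in this run are team profiles, the whole expression is False and the task is skipped.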
11728 1726882188.33777: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11728 1726882188.33859: in run() - task 12673a56-9f93-5c28-a762-00000000027c 11728 1726882188.33871: variable 'ansible_search_path' from source: unknown 11728 1726882188.33875: variable 'ansible_search_path' from source: unknown 11728 1726882188.33905: calling self._execute() 11728 1726882188.33968: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882188.33971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882188.33980: variable 'omit' from source: magic vars 11728 1726882188.34235: variable 'ansible_distribution_major_version' from source: facts 11728 1726882188.34244: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882188.34373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882188.35821: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882188.35870: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882188.35897: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882188.35926: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882188.35946: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882188.36004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882188.36029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882188.36046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882188.36071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882188.36084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882188.36164: variable 'ansible_distribution' from source: facts 11728 1726882188.36168: variable 'ansible_distribution_major_version' from source: facts 11728 1726882188.36179: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 11728 1726882188.36255: variable '__network_wireless_connections_defined' from source: role '' defaults 11728 1726882188.36338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882188.36357: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882188.36374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882188.36402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882188.36415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882188.36443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882188.36458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882188.36480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882188.36509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882188.36519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882188.36546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882188.36563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882188.36581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882188.36609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882188.36619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882188.36718: variable 'network_connections' from source: include params 11728 1726882188.36727: variable 'controller_profile' from source: play vars 11728 1726882188.36769: variable 'controller_profile' from source: play vars 11728 1726882188.36777: variable 'controller_device' from source: play vars 11728 1726882188.36825: variable 'controller_device' from source: play vars 11728 1726882188.36835: variable 'port1_profile' from 
source: play vars 11728 1726882188.36876: variable 'port1_profile' from source: play vars 11728 1726882188.36883: variable 'dhcp_interface1' from source: play vars 11728 1726882188.36930: variable 'dhcp_interface1' from source: play vars 11728 1726882188.36934: variable 'controller_profile' from source: play vars 11728 1726882188.36976: variable 'controller_profile' from source: play vars 11728 1726882188.36981: variable 'port2_profile' from source: play vars 11728 1726882188.37028: variable 'port2_profile' from source: play vars 11728 1726882188.37034: variable 'dhcp_interface2' from source: play vars 11728 1726882188.37074: variable 'dhcp_interface2' from source: play vars 11728 1726882188.37080: variable 'controller_profile' from source: play vars 11728 1726882188.37127: variable 'controller_profile' from source: play vars 11728 1726882188.37174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882188.37284: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882188.37317: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882188.37340: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882188.37362: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882188.37392: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882188.37423: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882188.37445: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882188.37462: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882188.37508: variable '__network_team_connections_defined' from source: role '' defaults 11728 1726882188.37656: variable 'network_connections' from source: include params 11728 1726882188.37659: variable 'controller_profile' from source: play vars 11728 1726882188.37705: variable 'controller_profile' from source: play vars 11728 1726882188.37710: variable 'controller_device' from source: play vars 11728 1726882188.37751: variable 'controller_device' from source: play vars 11728 1726882188.37761: variable 'port1_profile' from source: play vars 11728 1726882188.37807: variable 'port1_profile' from source: play vars 11728 1726882188.37813: variable 'dhcp_interface1' from source: play vars 11728 1726882188.37853: variable 'dhcp_interface1' from source: play vars 11728 1726882188.37858: variable 'controller_profile' from source: play vars 11728 1726882188.37905: variable 'controller_profile' from source: play vars 11728 1726882188.37911: variable 'port2_profile' from source: play vars 11728 1726882188.37952: variable 'port2_profile' from source: play vars 11728 1726882188.37958: variable 'dhcp_interface2' from source: play vars 11728 1726882188.38004: variable 'dhcp_interface2' from source: play 
vars 11728 1726882188.38010: variable 'controller_profile' from source: play vars 11728 1726882188.38051: variable 'controller_profile' from source: play vars 11728 1726882188.38074: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11728 1726882188.38079: when evaluation is False, skipping this task 11728 1726882188.38081: _execute() done 11728 1726882188.38084: dumping result to json 11728 1726882188.38086: done dumping result, returning 11728 1726882188.38096: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-5c28-a762-00000000027c] 11728 1726882188.38104: sending task result for task 12673a56-9f93-5c28-a762-00000000027c 11728 1726882188.38180: done sending task result for task 12673a56-9f93-5c28-a762-00000000027c 11728 1726882188.38182: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11728 1726882188.38242: no more pending results, returning what we have 11728 1726882188.38245: results queue empty 11728 1726882188.38246: checking for any_errors_fatal 11728 1726882188.38253: done checking for any_errors_fatal 11728 1726882188.38253: checking for max_fail_percentage 11728 1726882188.38255: done checking for max_fail_percentage 11728 1726882188.38256: checking to see if all hosts have failed and the running result is not ok 11728 1726882188.38257: done checking to see if all hosts have failed 11728 1726882188.38257: getting the remaining hosts for this loop 11728 1726882188.38259: done getting the remaining hosts for this loop 11728 1726882188.38262: getting the next task for host managed_node3 11728 1726882188.38269: done getting next task for host managed_node3 11728 1726882188.38273: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11728 1726882188.38278: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882188.38292: getting variables 11728 1726882188.38295: in VariableManager get_vars() 11728 1726882188.38326: Calling all_inventory to load vars for managed_node3 11728 1726882188.38329: Calling groups_inventory to load vars for managed_node3 11728 1726882188.38331: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882188.38338: Calling all_plugins_play to load vars for managed_node3 11728 1726882188.38340: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882188.38342: Calling groups_plugins_play to load vars for managed_node3 11728 1726882188.39090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882188.39943: done with get_vars() 11728 1726882188.39958: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11728 1726882188.40010: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:29:48 -0400 (0:00:00.066) 0:00:13.252 ****** 11728 1726882188.40035: entering _queue_task() for managed_node3/yum 11728 1726882188.40036: Creating lock for yum 11728 1726882188.40241: worker is 1 (out of 1 available) 11728 1726882188.40253: exiting _queue_task() for managed_node3/yum 11728 1726882188.40265: done queuing things up, now waiting for results queue to drain 11728 1726882188.40266: waiting for pending results... 
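The skip recorded just above shows the role gating its DNF "updates available" probe on whether any wireless or team connections are defined, and the following entries redirect the YUM twin of the same check to the dnf action plugin. A minimal sketch of what such a task could look like, under the assumption of ordinary module usage; only the `when` expression is taken from the log's false_condition, everything else is illustrative:

```yaml
# Minimal sketch only -- not the actual task source from the
# fedora.linux_system_roles.network role. The `when` expression is the
# false_condition quoted in the log; the module arguments and check_mode
# are assumptions about how an "updates available" probe could be written.
- name: >-
    Check if updates for network packages are available through the DNF
    package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"   # role variable seen in the log
    state: latest
  check_mode: true                   # assumed: probe for updates without applying them
  when: __network_wireless_connections_defined or __network_team_connections_defined
```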
11728 1726882188.40432: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11728 1726882188.40509: in run() - task 12673a56-9f93-5c28-a762-00000000027d 11728 1726882188.40520: variable 'ansible_search_path' from source: unknown 11728 1726882188.40524: variable 'ansible_search_path' from source: unknown 11728 1726882188.40551: calling self._execute() 11728 1726882188.40615: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882188.40619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882188.40627: variable 'omit' from source: magic vars 11728 1726882188.40878: variable 'ansible_distribution_major_version' from source: facts 11728 1726882188.40887: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882188.41010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882188.42476: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882188.42530: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882188.42561: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882188.42584: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882188.42607: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882188.42664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882188.42687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882188.42708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882188.42735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882188.42745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882188.42812: variable 'ansible_distribution_major_version' from source: facts 11728 1726882188.42825: Evaluated conditional (ansible_distribution_major_version | int < 8): False 11728 1726882188.42828: when evaluation is False, skipping this task 11728 1726882188.42831: _execute() done 11728 1726882188.42833: dumping result to json 11728 1726882188.42836: done dumping result, returning 11728 1726882188.42843: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-5c28-a762-00000000027d] 11728 
1726882188.42848: sending task result for task 12673a56-9f93-5c28-a762-00000000027d skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 11728 1726882188.42981: no more pending results, returning what we have 11728 1726882188.42984: results queue empty 11728 1726882188.42985: checking for any_errors_fatal 11728 1726882188.42992: done checking for any_errors_fatal 11728 1726882188.42992: checking for max_fail_percentage 11728 1726882188.42996: done checking for max_fail_percentage 11728 1726882188.42997: checking to see if all hosts have failed and the running result is not ok 11728 1726882188.42997: done checking to see if all hosts have failed 11728 1726882188.42998: getting the remaining hosts for this loop 11728 1726882188.43000: done getting the remaining hosts for this loop 11728 1726882188.43004: getting the next task for host managed_node3 11728 1726882188.43011: done getting next task for host managed_node3 11728 1726882188.43015: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11728 1726882188.43020: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882188.43033: getting variables 11728 1726882188.43034: in VariableManager get_vars() 11728 1726882188.43062: Calling all_inventory to load vars for managed_node3 11728 1726882188.43064: Calling groups_inventory to load vars for managed_node3 11728 1726882188.43066: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882188.43073: Calling all_plugins_play to load vars for managed_node3 11728 1726882188.43075: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882188.43078: Calling groups_plugins_play to load vars for managed_node3 11728 1726882188.43606: done sending task result for task 12673a56-9f93-5c28-a762-00000000027d 11728 1726882188.43610: WORKER PROCESS EXITING 11728 1726882188.43935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882188.44768: done with get_vars() 11728 1726882188.44781: done getting variables 11728 1726882188.44823: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:29:48 -0400 (0:00:00.048) 0:00:13.300 ****** 11728 1726882188.44848: entering _queue_task() for managed_node3/fail 11728 1726882188.45042: worker is 1 (out of 1 available) 11728 1726882188.45055: exiting _queue_task() for managed_node3/fail 11728 1726882188.45066: done queuing things up, now waiting for results queue to drain 11728 1726882188.45067: waiting for pending results... 
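The YUM variant was skipped because `ansible_distribution_major_version | int < 8` evaluated to False on this host (its major version is not below 8), and the role then queues a `fail`-based consent prompt. A hedged sketch of such a prompt, with the `when` condition taken from the false_condition reported a few entries further on and a placeholder message:

```yaml
# Illustrative sketch only. The `when` condition is the one the log reports
# as this task's false_condition; the message text is a placeholder, not the
# role's actual wording.
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: >-
      (placeholder) Wireless or team interfaces require restarting
      NetworkManager; set the appropriate role variable to allow it.
  when: __network_wireless_connections_defined or __network_team_connections_defined
```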
11728 1726882188.45235: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11728 1726882188.45329: in run() - task 12673a56-9f93-5c28-a762-00000000027e 11728 1726882188.45340: variable 'ansible_search_path' from source: unknown 11728 1726882188.45344: variable 'ansible_search_path' from source: unknown 11728 1726882188.45370: calling self._execute() 11728 1726882188.45434: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882188.45438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882188.45445: variable 'omit' from source: magic vars 11728 1726882188.45718: variable 'ansible_distribution_major_version' from source: facts 11728 1726882188.45727: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882188.45808: variable '__network_wireless_connections_defined' from source: role '' defaults 11728 1726882188.45935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882188.47800: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882188.47805: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882188.47808: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882188.47810: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882188.47812: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882188.47875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882188.47910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882188.47937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882188.47970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882188.47981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882188.48033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882188.48054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882188.48071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882188.48098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882188.48115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882188.48143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882188.48165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882188.48184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882188.48213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882188.48223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882188.48343: variable 'network_connections' from source: include params 11728 1726882188.48353: variable 'controller_profile' from source: play vars 11728 1726882188.48408: variable 'controller_profile' from source: play vars 11728 1726882188.48417: variable 'controller_device' from source: play vars 11728 1726882188.48462: variable 'controller_device' from source: play vars 11728 1726882188.48474: variable 'port1_profile' from source: play vars 11728 1726882188.48522: variable 'port1_profile' from source: play vars 11728 1726882188.48528: variable 'dhcp_interface1' from source: play vars 11728 1726882188.48571: variable 'dhcp_interface1' from source: play vars 11728 1726882188.48575: variable 'controller_profile' from source: play vars 11728 1726882188.48623: variable 'controller_profile' from source: play vars 11728 1726882188.48629: variable 'port2_profile' from source: play vars 11728 1726882188.48670: variable 'port2_profile' from source: play vars 11728 1726882188.48676: variable 'dhcp_interface2' from source: play vars 11728 1726882188.48723: variable 'dhcp_interface2' from source: play vars 11728 1726882188.48730: variable 'controller_profile' from source: play vars 11728 1726882188.48771: variable 'controller_profile' from source: play vars 11728 1726882188.48825: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882188.48942: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882188.48968: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882188.48990: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882188.49016: Loading TestModule 'uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882188.49050: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882188.49065: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882188.49082: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882188.49103: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882188.49155: variable '__network_team_connections_defined' from source: role '' defaults 11728 1726882188.49322: variable 'network_connections' from source: include params 11728 1726882188.49325: variable 'controller_profile' from source: play vars 11728 1726882188.49370: variable 'controller_profile' from source: play vars 11728 1726882188.49376: variable 'controller_device' from source: play vars 11728 1726882188.49420: variable 'controller_device' from source: play vars 11728 1726882188.49429: variable 'port1_profile' from source: play vars 11728 1726882188.49471: variable 'port1_profile' from source: play vars 11728 1726882188.49480: variable 'dhcp_interface1' from source: play vars 11728 1726882188.49524: variable 'dhcp_interface1' from source: play vars 11728 1726882188.49528: variable 'controller_profile' from source: play vars 11728 1726882188.49571: variable 'controller_profile' from source: play vars 11728 1726882188.49580: variable 'port2_profile' from source: play vars 11728 1726882188.49623: variable 'port2_profile' from source: play vars 11728 1726882188.49629: variable 'dhcp_interface2' from source: play vars 11728 1726882188.49699: variable 'dhcp_interface2' from source: play vars 11728 1726882188.49702: variable 'controller_profile' from source: play vars 11728 1726882188.49899: variable 'controller_profile' from source: play vars 11728 1726882188.49902: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11728 1726882188.49904: when evaluation is False, skipping this task 11728 1726882188.49907: _execute() done 11728 1726882188.49909: dumping result to json 11728 1726882188.49911: done dumping result, returning 11728 1726882188.49913: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-5c28-a762-00000000027e] 11728 1726882188.49914: sending task result for task 12673a56-9f93-5c28-a762-00000000027e 11728 1726882188.49977: done sending task result for task 12673a56-9f93-5c28-a762-00000000027e 11728 1726882188.49980: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11728 1726882188.50056: no more pending results, returning what we have 11728 1726882188.50060: results queue empty 11728 1726882188.50061: checking for any_errors_fatal 11728 1726882188.50067: done checking for any_errors_fatal 11728 
1726882188.50067: checking for max_fail_percentage 11728 1726882188.50069: done checking for max_fail_percentage 11728 1726882188.50070: checking to see if all hosts have failed and the running result is not ok 11728 1726882188.50071: done checking to see if all hosts have failed 11728 1726882188.50071: getting the remaining hosts for this loop 11728 1726882188.50073: done getting the remaining hosts for this loop 11728 1726882188.50077: getting the next task for host managed_node3 11728 1726882188.50084: done getting next task for host managed_node3 11728 1726882188.50088: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 11728 1726882188.50097: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882188.50113: getting variables 11728 1726882188.50114: in VariableManager get_vars() 11728 1726882188.50149: Calling all_inventory to load vars for managed_node3 11728 1726882188.50152: Calling groups_inventory to load vars for managed_node3 11728 1726882188.50154: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882188.50163: Calling all_plugins_play to load vars for managed_node3 11728 1726882188.50166: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882188.50168: Calling groups_plugins_play to load vars for managed_node3 11728 1726882188.51800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882188.53459: done with get_vars() 11728 1726882188.53481: done getting variables 11728 1726882188.53546: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:29:48 -0400 (0:00:00.087) 0:00:13.388 ****** 11728 1726882188.53580: entering _queue_task() for managed_node3/package 11728 1726882188.53979: worker is 1 (out of 1 available) 11728 1726882188.53990: exiting _queue_task() for managed_node3/package 11728 1726882188.54006: done queuing things up, now waiting for results queue to drain 11728 1726882188.54007: waiting for pending results... 11728 1726882188.54324: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 11728 1726882188.54441: in run() - task 12673a56-9f93-5c28-a762-00000000027f 11728 1726882188.54462: variable 'ansible_search_path' from source: unknown 11728 1726882188.54489: variable 'ansible_search_path' from source: unknown 11728 1726882188.54519: calling self._execute() 11728 1726882188.54711: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882188.54715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882188.54718: variable 'omit' from source: magic vars 11728 1726882188.55060: variable 'ansible_distribution_major_version' from source: facts 11728 1726882188.55081: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882188.55300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882188.55606: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882188.55652: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882188.55701: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882188.55738: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882188.55853: variable 'network_packages' from source: role '' defaults 11728 1726882188.55968: variable '__network_provider_setup' from source: role '' defaults 11728 1726882188.55983: variable '__network_service_name_default_nm' from source: role '' defaults 11728 1726882188.56130: variable 
'__network_service_name_default_nm' from source: role '' defaults 11728 1726882188.56133: variable '__network_packages_default_nm' from source: role '' defaults 11728 1726882188.56142: variable '__network_packages_default_nm' from source: role '' defaults 11728 1726882188.56332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882188.64003: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882188.64007: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882188.64009: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882188.64224: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882188.64267: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882188.64451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882188.64482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882188.64513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882188.64617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882188.64636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882188.64686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882188.64784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882188.64816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882188.64981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882188.65006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882188.65458: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11728 1726882188.65663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882188.65726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882188.65758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882188.65884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882188.65907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882188.66107: variable 'ansible_python' from source: facts 11728 1726882188.66278: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11728 1726882188.66281: variable '__network_wpa_supplicant_required' from source: role '' defaults 11728 1726882188.66460: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11728 1726882188.66735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882188.66831: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882188.66860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882188.67201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882188.67205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882188.67208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882188.67217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882188.67246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882188.67287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882188.67322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882188.67589: variable 'network_connections' from source: include params 11728 1726882188.67708: variable 'controller_profile' from source: play vars 11728 1726882188.67870: variable 'controller_profile' from source: play vars 11728 1726882188.67886: variable 'controller_device' from source: play vars 11728 1726882188.68112: variable 'controller_device' from source: play vars 11728 1726882188.68133: variable 'port1_profile' from source: play vars 11728 1726882188.68400: variable 'port1_profile' from source: play vars 11728 1726882188.68407: variable 'dhcp_interface1' from source: play vars 11728 1726882188.68612: variable 'dhcp_interface1' from source: play vars 11728 1726882188.68628: variable 'controller_profile' from source: play vars 11728 1726882188.68844: variable 'controller_profile' from source: play vars 11728 1726882188.68859: variable 'port2_profile' from source: play vars 11728 1726882188.69063: variable 'port2_profile' from source: play vars 11728 1726882188.69078: variable 'dhcp_interface2' from source: play vars 11728 1726882188.69259: variable 'dhcp_interface2' from source: play vars 11728 1726882188.69400: variable 'controller_profile' from source: play vars 11728 1726882188.69699: variable 'controller_profile' from source: play vars 11728 1726882188.69704: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882188.69928: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882188.69931: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882188.69933: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882188.69936: variable '__network_wireless_connections_defined' from source: role '' defaults 11728 1726882188.70562: variable 'network_connections' from source: include params 11728 1726882188.70611: variable 'controller_profile' from source: play vars 11728 1726882188.70842: variable 'controller_profile' from source: play vars 11728 1726882188.70857: variable 'controller_device' from source: play vars 11728 1726882188.71028: variable 'controller_device' from source: play vars 11728 1726882188.71200: variable 'port1_profile' from source: play vars 11728 1726882188.71275: variable 'port1_profile' from source: play vars 11728 1726882188.71480: variable 'dhcp_interface1' from source: play vars 11728 1726882188.71567: variable 'dhcp_interface1' from source: play vars 11728 1726882188.71583: variable 'controller_profile' from source: play vars 11728 1726882188.71696: variable 'controller_profile' from source: play vars 11728 1726882188.71789: variable 'port2_profile' from source: play vars 11728 1726882188.71909: variable 'port2_profile' from source: play vars 11728 1726882188.71928: variable 'dhcp_interface2' from source: play vars 11728 1726882188.72061: variable 'dhcp_interface2' from source: play vars 11728 1726882188.72075: variable 'controller_profile' from 
source: play vars 11728 1726882188.72185: variable 'controller_profile' from source: play vars 11728 1726882188.72261: variable '__network_packages_default_wireless' from source: role '' defaults 11728 1726882188.72348: variable '__network_wireless_connections_defined' from source: role '' defaults 11728 1726882188.72689: variable 'network_connections' from source: include params 11728 1726882188.72703: variable 'controller_profile' from source: play vars 11728 1726882188.72772: variable 'controller_profile' from source: play vars 11728 1726882188.72800: variable 'controller_device' from source: play vars 11728 1726882188.72907: variable 'controller_device' from source: play vars 11728 1726882188.72913: variable 'port1_profile' from source: play vars 11728 1726882188.72961: variable 'port1_profile' from source: play vars 11728 1726882188.72974: variable 'dhcp_interface1' from source: play vars 11728 1726882188.73053: variable 'dhcp_interface1' from source: play vars 11728 1726882188.73066: variable 'controller_profile' from source: play vars 11728 1726882188.73147: variable 'controller_profile' from source: play vars 11728 1726882188.73201: variable 'port2_profile' from source: play vars 11728 1726882188.73238: variable 'port2_profile' from source: play vars 11728 1726882188.73255: variable 'dhcp_interface2' from source: play vars 11728 1726882188.73325: variable 'dhcp_interface2' from source: play vars 11728 1726882188.73342: variable 'controller_profile' from source: play vars 11728 1726882188.73415: variable 'controller_profile' from source: play vars 11728 1726882188.73452: variable '__network_packages_default_team' from source: role '' defaults 11728 1726882188.73557: variable '__network_team_connections_defined' from source: role '' defaults 11728 1726882188.73900: variable 'network_connections' from source: include params 11728 1726882188.73996: variable 'controller_profile' from source: play vars 11728 1726882188.74000: variable 'controller_profile' from source: play vars 11728 1726882188.74007: variable 'controller_device' from source: play vars 11728 1726882188.74063: variable 'controller_device' from source: play vars 11728 1726882188.74080: variable 'port1_profile' from source: play vars 11728 1726882188.74344: variable 'port1_profile' from source: play vars 11728 1726882188.74500: variable 'dhcp_interface1' from source: play vars 11728 1726882188.74503: variable 'dhcp_interface1' from source: play vars 11728 1726882188.74505: variable 'controller_profile' from source: play vars 11728 1726882188.74589: variable 'controller_profile' from source: play vars 11728 1726882188.74655: variable 'port2_profile' from source: play vars 11728 1726882188.74776: variable 'port2_profile' from source: play vars 11728 1726882188.74789: variable 'dhcp_interface2' from source: play vars 11728 1726882188.75002: variable 'dhcp_interface2' from source: play vars 11728 1726882188.75005: variable 'controller_profile' from source: play vars 11728 1726882188.75033: variable 'controller_profile' from source: play vars 11728 1726882188.75179: variable '__network_service_name_default_initscripts' from source: role '' defaults 11728 1726882188.75280: variable '__network_service_name_default_initscripts' from source: role '' defaults 11728 1726882188.75408: variable '__network_packages_default_initscripts' from source: role '' defaults 11728 1726882188.75547: variable '__network_packages_default_initscripts' from source: role '' defaults 11728 1726882188.76303: variable 
'__network_packages_default_initscripts_bridge' from source: role '' defaults 11728 1726882188.77075: variable 'network_connections' from source: include params 11728 1726882188.77088: variable 'controller_profile' from source: play vars 11728 1726882188.77159: variable 'controller_profile' from source: play vars 11728 1726882188.77300: variable 'controller_device' from source: play vars 11728 1726882188.77356: variable 'controller_device' from source: play vars 11728 1726882188.77410: variable 'port1_profile' from source: play vars 11728 1726882188.77600: variable 'port1_profile' from source: play vars 11728 1726882188.77603: variable 'dhcp_interface1' from source: play vars 11728 1726882188.77652: variable 'dhcp_interface1' from source: play vars 11728 1726882188.77708: variable 'controller_profile' from source: play vars 11728 1726882188.77901: variable 'controller_profile' from source: play vars 11728 1726882188.77904: variable 'port2_profile' from source: play vars 11728 1726882188.78015: variable 'port2_profile' from source: play vars 11728 1726882188.78028: variable 'dhcp_interface2' from source: play vars 11728 1726882188.78092: variable 'dhcp_interface2' from source: play vars 11728 1726882188.78161: variable 'controller_profile' from source: play vars 11728 1726882188.78300: variable 'controller_profile' from source: play vars 11728 1726882188.78303: variable 'ansible_distribution' from source: facts 11728 1726882188.78305: variable '__network_rh_distros' from source: role '' defaults 11728 1726882188.78308: variable 'ansible_distribution_major_version' from source: facts 11728 1726882188.78450: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11728 1726882188.78754: variable 'ansible_distribution' from source: facts 11728 1726882188.78814: variable '__network_rh_distros' from source: role '' defaults 11728 1726882188.78826: variable 'ansible_distribution_major_version' from source: facts 11728 1726882188.79000: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11728 1726882188.79228: variable 'ansible_distribution' from source: facts 11728 1726882188.79241: variable '__network_rh_distros' from source: role '' defaults 11728 1726882188.79357: variable 'ansible_distribution_major_version' from source: facts 11728 1726882188.79361: variable 'network_provider' from source: set_fact 11728 1726882188.79414: variable 'ansible_facts' from source: unknown 11728 1726882188.80801: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 11728 1726882188.80990: when evaluation is False, skipping this task 11728 1726882188.80998: _execute() done 11728 1726882188.81001: dumping result to json 11728 1726882188.81003: done dumping result, returning 11728 1726882188.81006: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-5c28-a762-00000000027f] 11728 1726882188.81008: sending task result for task 12673a56-9f93-5c28-a762-00000000027f 11728 1726882188.81076: done sending task result for task 12673a56-9f93-5c28-a762-00000000027f 11728 1726882188.81081: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 11728 1726882188.81141: no more pending results, returning what we have 11728 1726882188.81144: results queue empty 11728 1726882188.81145: checking for 
any_errors_fatal 11728 1726882188.81150: done checking for any_errors_fatal 11728 1726882188.81151: checking for max_fail_percentage 11728 1726882188.81153: done checking for max_fail_percentage 11728 1726882188.81153: checking to see if all hosts have failed and the running result is not ok 11728 1726882188.81154: done checking to see if all hosts have failed 11728 1726882188.81155: getting the remaining hosts for this loop 11728 1726882188.81157: done getting the remaining hosts for this loop 11728 1726882188.81161: getting the next task for host managed_node3 11728 1726882188.81167: done getting next task for host managed_node3 11728 1726882188.81171: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11728 1726882188.81176: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882188.81190: getting variables 11728 1726882188.81191: in VariableManager get_vars() 11728 1726882188.81231: Calling all_inventory to load vars for managed_node3 11728 1726882188.81235: Calling groups_inventory to load vars for managed_node3 11728 1726882188.81237: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882188.81246: Calling all_plugins_play to load vars for managed_node3 11728 1726882188.81249: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882188.81252: Calling groups_plugins_play to load vars for managed_node3 11728 1726882188.93389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882188.95586: done with get_vars() 11728 1726882188.95623: done getting variables 11728 1726882188.95682: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:29:48 -0400 (0:00:00.421) 0:00:13.809 ****** 11728 1726882188.95719: entering _queue_task() for managed_node3/package 11728 1726882188.96068: worker is 1 (out of 1 available) 11728 1726882188.96082: exiting _queue_task() for managed_node3/package 11728 1726882188.96300: done queuing things up, now waiting for results queue to drain 11728 1726882188.96302: waiting for pending results... 
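The "Install packages" task above was skipped because `not network_packages is subset(ansible_facts.packages.keys())` evaluated to False, i.e. every requested package already appears in the gathered package facts, so there is nothing to install. A minimal sketch of a task built around that same subset test, assuming a plain ansible.builtin.package install; only the condition and the network_packages variable come from the log:

```yaml
# Illustrative sketch only. The condition is quoted from the log: the install
# is skipped when every entry in network_packages is already present in the
# package facts gathered earlier (ansible_facts.packages).
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())
```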
11728 1726882188.96432: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11728 1726882188.96607: in run() - task 12673a56-9f93-5c28-a762-000000000280 11728 1726882188.96613: variable 'ansible_search_path' from source: unknown 11728 1726882188.96616: variable 'ansible_search_path' from source: unknown 11728 1726882188.96653: calling self._execute() 11728 1726882188.96756: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882188.96769: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882188.96784: variable 'omit' from source: magic vars 11728 1726882188.97231: variable 'ansible_distribution_major_version' from source: facts 11728 1726882188.97248: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882188.97379: variable 'network_state' from source: role '' defaults 11728 1726882188.97407: Evaluated conditional (network_state != {}): False 11728 1726882188.97415: when evaluation is False, skipping this task 11728 1726882188.97423: _execute() done 11728 1726882188.97431: dumping result to json 11728 1726882188.97438: done dumping result, returning 11728 1726882188.97450: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-5c28-a762-000000000280] 11728 1726882188.97463: sending task result for task 12673a56-9f93-5c28-a762-000000000280 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11728 1726882188.97658: no more pending results, returning what we have 11728 1726882188.97662: results queue empty 11728 1726882188.97663: checking for any_errors_fatal 11728 1726882188.97674: done checking for any_errors_fatal 11728 1726882188.97674: checking for max_fail_percentage 11728 1726882188.97676: done checking for max_fail_percentage 11728 1726882188.97677: checking to see if all hosts have failed and the running result is not ok 11728 1726882188.97677: done checking to see if all hosts have failed 11728 1726882188.97678: getting the remaining hosts for this loop 11728 1726882188.97680: done getting the remaining hosts for this loop 11728 1726882188.97683: getting the next task for host managed_node3 11728 1726882188.97690: done getting next task for host managed_node3 11728 1726882188.97698: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11728 1726882188.97704: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882188.97720: getting variables 11728 1726882188.97722: in VariableManager get_vars() 11728 1726882188.97760: Calling all_inventory to load vars for managed_node3 11728 1726882188.97763: Calling groups_inventory to load vars for managed_node3 11728 1726882188.97765: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882188.97777: Calling all_plugins_play to load vars for managed_node3 11728 1726882188.97781: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882188.97784: Calling groups_plugins_play to load vars for managed_node3 11728 1726882188.98474: done sending task result for task 12673a56-9f93-5c28-a762-000000000280 11728 1726882188.98478: WORKER PROCESS EXITING 11728 1726882189.00348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882189.03555: done with get_vars() 11728 1726882189.03579: done getting variables 11728 1726882189.03842: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:29:49 -0400 (0:00:00.081) 0:00:13.891 ****** 11728 1726882189.03876: entering _queue_task() for managed_node3/package 11728 1726882189.04434: worker is 1 (out of 1 available) 11728 1726882189.04445: exiting _queue_task() for managed_node3/package 11728 1726882189.04456: done queuing things up, now waiting for results queue to drain 11728 1726882189.04457: waiting for pending results... 
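Both install tasks gate on network_state != {}, and this play never sets network_state, so both are skipped. A hypothetical invocation that would flip the conditional to True is sketched below; the interface values are illustrative nmstate-style settings and are not taken from this run.

# Hypothetical play: any non-empty network_state makes "network_state != {}" True.
- hosts: managed_node3
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_state:
          interfaces:
            - name: eth0        # illustrative interface, not from this run
              type: ethernet
              state: up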
11728 1726882189.04779: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11728 1726882189.05139: in run() - task 12673a56-9f93-5c28-a762-000000000281 11728 1726882189.05142: variable 'ansible_search_path' from source: unknown 11728 1726882189.05145: variable 'ansible_search_path' from source: unknown 11728 1726882189.05151: calling self._execute() 11728 1726882189.05237: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882189.05302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882189.05306: variable 'omit' from source: magic vars 11728 1726882189.05862: variable 'ansible_distribution_major_version' from source: facts 11728 1726882189.05865: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882189.05982: variable 'network_state' from source: role '' defaults 11728 1726882189.06000: Evaluated conditional (network_state != {}): False 11728 1726882189.06027: when evaluation is False, skipping this task 11728 1726882189.06030: _execute() done 11728 1726882189.06033: dumping result to json 11728 1726882189.06034: done dumping result, returning 11728 1726882189.06137: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-5c28-a762-000000000281] 11728 1726882189.06141: sending task result for task 12673a56-9f93-5c28-a762-000000000281 11728 1726882189.06219: done sending task result for task 12673a56-9f93-5c28-a762-000000000281 11728 1726882189.06223: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11728 1726882189.06269: no more pending results, returning what we have 11728 1726882189.06273: results queue empty 11728 1726882189.06274: checking for any_errors_fatal 11728 1726882189.06282: done checking for any_errors_fatal 11728 1726882189.06283: checking for max_fail_percentage 11728 1726882189.06284: done checking for max_fail_percentage 11728 1726882189.06285: checking to see if all hosts have failed and the running result is not ok 11728 1726882189.06285: done checking to see if all hosts have failed 11728 1726882189.06286: getting the remaining hosts for this loop 11728 1726882189.06470: done getting the remaining hosts for this loop 11728 1726882189.06475: getting the next task for host managed_node3 11728 1726882189.06482: done getting next task for host managed_node3 11728 1726882189.06486: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11728 1726882189.06491: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882189.06507: getting variables 11728 1726882189.06509: in VariableManager get_vars() 11728 1726882189.06541: Calling all_inventory to load vars for managed_node3 11728 1726882189.06543: Calling groups_inventory to load vars for managed_node3 11728 1726882189.06546: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882189.06554: Calling all_plugins_play to load vars for managed_node3 11728 1726882189.06557: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882189.06560: Calling groups_plugins_play to load vars for managed_node3 11728 1726882189.08271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882189.10142: done with get_vars() 11728 1726882189.10170: done getting variables 11728 1726882189.10275: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:29:49 -0400 (0:00:00.064) 0:00:13.955 ****** 11728 1726882189.10310: entering _queue_task() for managed_node3/service 11728 1726882189.10312: Creating lock for service 11728 1726882189.10740: worker is 1 (out of 1 available) 11728 1726882189.10751: exiting _queue_task() for managed_node3/service 11728 1726882189.10761: done queuing things up, now waiting for results queue to drain 11728 1726882189.10763: waiting for pending results... 
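The restart task queued above is guarded by two role facts, __network_wireless_connections_defined and __network_team_connections_defined; both evaluate False for this controller/port scenario, so the task is skipped below. A minimal sketch of such a guarded restart, using the service action module the log loads for this task (the real task sits at roles/network/tasks/main.yml:109 and may differ):

# Sketch only; condition copied from the evaluation logged below.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined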
11728 1726882189.11006: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11728 1726882189.11233: in run() - task 12673a56-9f93-5c28-a762-000000000282 11728 1726882189.11237: variable 'ansible_search_path' from source: unknown 11728 1726882189.11243: variable 'ansible_search_path' from source: unknown 11728 1726882189.11262: calling self._execute() 11728 1726882189.11361: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882189.11372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882189.11422: variable 'omit' from source: magic vars 11728 1726882189.11814: variable 'ansible_distribution_major_version' from source: facts 11728 1726882189.11831: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882189.11963: variable '__network_wireless_connections_defined' from source: role '' defaults 11728 1726882189.12162: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882189.15715: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882189.15719: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882189.15721: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882189.15724: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882189.15726: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882189.15728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882189.15749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882189.15773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882189.15820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882189.15834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882189.16086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882189.16089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882189.16091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 11728 1726882189.16098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882189.16100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882189.16103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882189.16105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882189.16107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882189.16110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882189.16112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882189.16402: variable 'network_connections' from source: include params 11728 1726882189.16406: variable 'controller_profile' from source: play vars 11728 1726882189.16409: variable 'controller_profile' from source: play vars 11728 1726882189.16412: variable 'controller_device' from source: play vars 11728 1726882189.16444: variable 'controller_device' from source: play vars 11728 1726882189.16456: variable 'port1_profile' from source: play vars 11728 1726882189.16523: variable 'port1_profile' from source: play vars 11728 1726882189.16530: variable 'dhcp_interface1' from source: play vars 11728 1726882189.16599: variable 'dhcp_interface1' from source: play vars 11728 1726882189.16612: variable 'controller_profile' from source: play vars 11728 1726882189.16672: variable 'controller_profile' from source: play vars 11728 1726882189.16676: variable 'port2_profile' from source: play vars 11728 1726882189.16930: variable 'port2_profile' from source: play vars 11728 1726882189.16936: variable 'dhcp_interface2' from source: play vars 11728 1726882189.16987: variable 'dhcp_interface2' from source: play vars 11728 1726882189.16997: variable 'controller_profile' from source: play vars 11728 1726882189.17086: variable 'controller_profile' from source: play vars 11728 1726882189.17182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882189.17587: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882189.17700: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882189.17729: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882189.17757: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882189.17918: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882189.17942: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882189.17965: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882189.17998: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882189.18166: variable '__network_team_connections_defined' from source: role '' defaults 11728 1726882189.18557: variable 'network_connections' from source: include params 11728 1726882189.18576: variable 'controller_profile' from source: play vars 11728 1726882189.18651: variable 'controller_profile' from source: play vars 11728 1726882189.18657: variable 'controller_device' from source: play vars 11728 1726882189.18717: variable 'controller_device' from source: play vars 11728 1726882189.18813: variable 'port1_profile' from source: play vars 11728 1726882189.18817: variable 'port1_profile' from source: play vars 11728 1726882189.18820: variable 'dhcp_interface1' from source: play vars 11728 1726882189.18856: variable 'dhcp_interface1' from source: play vars 11728 1726882189.18860: variable 'controller_profile' from source: play vars 11728 1726882189.18922: variable 'controller_profile' from source: play vars 11728 1726882189.18929: variable 'port2_profile' from source: play vars 11728 1726882189.18989: variable 'port2_profile' from source: play vars 11728 1726882189.18998: variable 'dhcp_interface2' from source: play vars 11728 1726882189.19053: variable 'dhcp_interface2' from source: play vars 11728 1726882189.19059: variable 'controller_profile' from source: play vars 11728 1726882189.19120: variable 'controller_profile' from source: play vars 11728 1726882189.19151: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11728 1726882189.19155: when evaluation is False, skipping this task 11728 1726882189.19157: _execute() done 11728 1726882189.19159: dumping result to json 11728 1726882189.19162: done dumping result, returning 11728 1726882189.19171: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-5c28-a762-000000000282] 11728 1726882189.19178: sending task result for task 12673a56-9f93-5c28-a762-000000000282 11728 1726882189.19331: done sending task result for task 12673a56-9f93-5c28-a762-000000000282 11728 1726882189.19335: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11728 1726882189.19384: no more pending results, returning what we have 11728 1726882189.19388: results queue empty 11728 1726882189.19389: checking for any_errors_fatal 11728 1726882189.19398: done checking for any_errors_fatal 11728 1726882189.19399: checking for max_fail_percentage 11728 1726882189.19401: done checking for max_fail_percentage 11728 
1726882189.19402: checking to see if all hosts have failed and the running result is not ok 11728 1726882189.19403: done checking to see if all hosts have failed 11728 1726882189.19403: getting the remaining hosts for this loop 11728 1726882189.19405: done getting the remaining hosts for this loop 11728 1726882189.19410: getting the next task for host managed_node3 11728 1726882189.19418: done getting next task for host managed_node3 11728 1726882189.19422: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11728 1726882189.19428: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882189.19442: getting variables 11728 1726882189.19444: in VariableManager get_vars() 11728 1726882189.19483: Calling all_inventory to load vars for managed_node3 11728 1726882189.19486: Calling groups_inventory to load vars for managed_node3 11728 1726882189.19489: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882189.19504: Calling all_plugins_play to load vars for managed_node3 11728 1726882189.19507: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882189.19511: Calling groups_plugins_play to load vars for managed_node3 11728 1726882189.21138: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882189.22240: done with get_vars() 11728 1726882189.22265: done getting variables 11728 1726882189.22313: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:29:49 -0400 (0:00:00.120) 0:00:14.075 ****** 11728 1726882189.22354: entering _queue_task() for managed_node3/service 11728 1726882189.22636: worker is 1 (out of 1 available) 11728 1726882189.22649: exiting _queue_task() for managed_node3/service 11728 1726882189.22660: done queuing things up, now waiting for results queue to drain 11728 1726882189.22662: waiting for pending results... 
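Unlike the three tasks above, this one passes its conditional (network_provider == "nm"), so the service action actually runs against the managed node: connection vars are set, a remote temp directory is created, and the AnsiballZ_systemd.py payload is transferred and executed. A hedged sketch of the general shape of such a task follows; the service name is resolved from network_service_name (falling back to __network_service_name_default_nm, as the variable lookups below show), and enabled: true is an assumption about the sketch, not a quote from the role.

# Sketch only; condition copied from the evaluation logged below.
- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}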
11728 1726882189.23011: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11728 1726882189.23047: in run() - task 12673a56-9f93-5c28-a762-000000000283 11728 1726882189.23061: variable 'ansible_search_path' from source: unknown 11728 1726882189.23065: variable 'ansible_search_path' from source: unknown 11728 1726882189.23102: calling self._execute() 11728 1726882189.23199: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882189.23203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882189.23210: variable 'omit' from source: magic vars 11728 1726882189.23574: variable 'ansible_distribution_major_version' from source: facts 11728 1726882189.23585: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882189.23703: variable 'network_provider' from source: set_fact 11728 1726882189.23707: variable 'network_state' from source: role '' defaults 11728 1726882189.23716: Evaluated conditional (network_provider == "nm" or network_state != {}): True 11728 1726882189.23721: variable 'omit' from source: magic vars 11728 1726882189.23761: variable 'omit' from source: magic vars 11728 1726882189.23782: variable 'network_service_name' from source: role '' defaults 11728 1726882189.23831: variable 'network_service_name' from source: role '' defaults 11728 1726882189.23901: variable '__network_provider_setup' from source: role '' defaults 11728 1726882189.23905: variable '__network_service_name_default_nm' from source: role '' defaults 11728 1726882189.23952: variable '__network_service_name_default_nm' from source: role '' defaults 11728 1726882189.23959: variable '__network_packages_default_nm' from source: role '' defaults 11728 1726882189.24004: variable '__network_packages_default_nm' from source: role '' defaults 11728 1726882189.24150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882189.25821: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882189.25824: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882189.25827: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882189.25829: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882189.25849: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882189.25924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882189.25953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882189.25978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882189.26021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 11728 1726882189.26037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882189.26166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882189.26169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882189.26172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882189.26174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882189.26177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882189.26391: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11728 1726882189.26499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882189.26522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882189.26544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882189.26581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882189.26597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882189.26677: variable 'ansible_python' from source: facts 11728 1726882189.26692: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11728 1726882189.26771: variable '__network_wpa_supplicant_required' from source: role '' defaults 11728 1726882189.26846: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11728 1726882189.26962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882189.26985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882189.27050: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882189.27054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882189.27056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882189.27103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882189.27127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882189.27165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882189.27185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882189.27204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882189.27324: variable 'network_connections' from source: include params 11728 1726882189.27330: variable 'controller_profile' from source: play vars 11728 1726882189.27388: variable 'controller_profile' from source: play vars 11728 1726882189.27415: variable 'controller_device' from source: play vars 11728 1726882189.27462: variable 'controller_device' from source: play vars 11728 1726882189.27479: variable 'port1_profile' from source: play vars 11728 1726882189.27531: variable 'port1_profile' from source: play vars 11728 1726882189.27540: variable 'dhcp_interface1' from source: play vars 11728 1726882189.27591: variable 'dhcp_interface1' from source: play vars 11728 1726882189.27607: variable 'controller_profile' from source: play vars 11728 1726882189.27656: variable 'controller_profile' from source: play vars 11728 1726882189.27664: variable 'port2_profile' from source: play vars 11728 1726882189.27720: variable 'port2_profile' from source: play vars 11728 1726882189.27729: variable 'dhcp_interface2' from source: play vars 11728 1726882189.27778: variable 'dhcp_interface2' from source: play vars 11728 1726882189.27786: variable 'controller_profile' from source: play vars 11728 1726882189.27843: variable 'controller_profile' from source: play vars 11728 1726882189.27918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882189.28050: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882189.28085: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882189.28121: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 
1726882189.28152: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882189.28195: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882189.28218: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882189.28244: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882189.28268: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882189.28310: variable '__network_wireless_connections_defined' from source: role '' defaults 11728 1726882189.28491: variable 'network_connections' from source: include params 11728 1726882189.28500: variable 'controller_profile' from source: play vars 11728 1726882189.28554: variable 'controller_profile' from source: play vars 11728 1726882189.28559: variable 'controller_device' from source: play vars 11728 1726882189.28623: variable 'controller_device' from source: play vars 11728 1726882189.28635: variable 'port1_profile' from source: play vars 11728 1726882189.28688: variable 'port1_profile' from source: play vars 11728 1726882189.28701: variable 'dhcp_interface1' from source: play vars 11728 1726882189.28750: variable 'dhcp_interface1' from source: play vars 11728 1726882189.28758: variable 'controller_profile' from source: play vars 11728 1726882189.28835: variable 'controller_profile' from source: play vars 11728 1726882189.28838: variable 'port2_profile' from source: play vars 11728 1726882189.28906: variable 'port2_profile' from source: play vars 11728 1726882189.28919: variable 'dhcp_interface2' from source: play vars 11728 1726882189.28985: variable 'dhcp_interface2' from source: play vars 11728 1726882189.29003: variable 'controller_profile' from source: play vars 11728 1726882189.29063: variable 'controller_profile' from source: play vars 11728 1726882189.29267: variable '__network_packages_default_wireless' from source: role '' defaults 11728 1726882189.29270: variable '__network_wireless_connections_defined' from source: role '' defaults 11728 1726882189.29500: variable 'network_connections' from source: include params 11728 1726882189.29503: variable 'controller_profile' from source: play vars 11728 1726882189.29530: variable 'controller_profile' from source: play vars 11728 1726882189.29537: variable 'controller_device' from source: play vars 11728 1726882189.29604: variable 'controller_device' from source: play vars 11728 1726882189.29631: variable 'port1_profile' from source: play vars 11728 1726882189.29679: variable 'port1_profile' from source: play vars 11728 1726882189.29685: variable 'dhcp_interface1' from source: play vars 11728 1726882189.29753: variable 'dhcp_interface1' from source: play vars 11728 1726882189.29758: variable 'controller_profile' from source: play vars 11728 1726882189.29820: variable 'controller_profile' from source: play vars 11728 1726882189.29826: variable 'port2_profile' from source: play vars 11728 1726882189.29955: variable 'port2_profile' from source: play vars 11728 
1726882189.29959: variable 'dhcp_interface2' from source: play vars 11728 1726882189.29977: variable 'dhcp_interface2' from source: play vars 11728 1726882189.29981: variable 'controller_profile' from source: play vars 11728 1726882189.30035: variable 'controller_profile' from source: play vars 11728 1726882189.30116: variable '__network_packages_default_team' from source: role '' defaults 11728 1726882189.30131: variable '__network_team_connections_defined' from source: role '' defaults 11728 1726882189.30428: variable 'network_connections' from source: include params 11728 1726882189.30432: variable 'controller_profile' from source: play vars 11728 1726882189.30501: variable 'controller_profile' from source: play vars 11728 1726882189.30554: variable 'controller_device' from source: play vars 11728 1726882189.30570: variable 'controller_device' from source: play vars 11728 1726882189.30582: variable 'port1_profile' from source: play vars 11728 1726882189.30649: variable 'port1_profile' from source: play vars 11728 1726882189.30662: variable 'dhcp_interface1' from source: play vars 11728 1726882189.30725: variable 'dhcp_interface1' from source: play vars 11728 1726882189.30731: variable 'controller_profile' from source: play vars 11728 1726882189.30804: variable 'controller_profile' from source: play vars 11728 1726882189.30807: variable 'port2_profile' from source: play vars 11728 1726882189.30903: variable 'port2_profile' from source: play vars 11728 1726882189.30906: variable 'dhcp_interface2' from source: play vars 11728 1726882189.30949: variable 'dhcp_interface2' from source: play vars 11728 1726882189.30952: variable 'controller_profile' from source: play vars 11728 1726882189.31029: variable 'controller_profile' from source: play vars 11728 1726882189.31091: variable '__network_service_name_default_initscripts' from source: role '' defaults 11728 1726882189.31131: variable '__network_service_name_default_initscripts' from source: role '' defaults 11728 1726882189.31137: variable '__network_packages_default_initscripts' from source: role '' defaults 11728 1726882189.31179: variable '__network_packages_default_initscripts' from source: role '' defaults 11728 1726882189.31318: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11728 1726882189.31614: variable 'network_connections' from source: include params 11728 1726882189.31618: variable 'controller_profile' from source: play vars 11728 1726882189.31662: variable 'controller_profile' from source: play vars 11728 1726882189.31668: variable 'controller_device' from source: play vars 11728 1726882189.31710: variable 'controller_device' from source: play vars 11728 1726882189.31720: variable 'port1_profile' from source: play vars 11728 1726882189.31762: variable 'port1_profile' from source: play vars 11728 1726882189.31768: variable 'dhcp_interface1' from source: play vars 11728 1726882189.31811: variable 'dhcp_interface1' from source: play vars 11728 1726882189.31817: variable 'controller_profile' from source: play vars 11728 1726882189.31862: variable 'controller_profile' from source: play vars 11728 1726882189.31865: variable 'port2_profile' from source: play vars 11728 1726882189.31907: variable 'port2_profile' from source: play vars 11728 1726882189.31913: variable 'dhcp_interface2' from source: play vars 11728 1726882189.31953: variable 'dhcp_interface2' from source: play vars 11728 1726882189.31958: variable 'controller_profile' from source: play vars 11728 1726882189.32003: variable 
'controller_profile' from source: play vars 11728 1726882189.32010: variable 'ansible_distribution' from source: facts 11728 1726882189.32013: variable '__network_rh_distros' from source: role '' defaults 11728 1726882189.32018: variable 'ansible_distribution_major_version' from source: facts 11728 1726882189.32037: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11728 1726882189.32148: variable 'ansible_distribution' from source: facts 11728 1726882189.32151: variable '__network_rh_distros' from source: role '' defaults 11728 1726882189.32156: variable 'ansible_distribution_major_version' from source: facts 11728 1726882189.32166: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11728 1726882189.32275: variable 'ansible_distribution' from source: facts 11728 1726882189.32278: variable '__network_rh_distros' from source: role '' defaults 11728 1726882189.32281: variable 'ansible_distribution_major_version' from source: facts 11728 1726882189.32315: variable 'network_provider' from source: set_fact 11728 1726882189.32332: variable 'omit' from source: magic vars 11728 1726882189.32352: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882189.32372: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882189.32386: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882189.32401: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882189.32410: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882189.32435: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882189.32438: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882189.32440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882189.32502: Set connection var ansible_connection to ssh 11728 1726882189.32512: Set connection var ansible_shell_executable to /bin/sh 11728 1726882189.32517: Set connection var ansible_timeout to 10 11728 1726882189.32520: Set connection var ansible_shell_type to sh 11728 1726882189.32528: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882189.32532: Set connection var ansible_pipelining to False 11728 1726882189.32551: variable 'ansible_shell_executable' from source: unknown 11728 1726882189.32554: variable 'ansible_connection' from source: unknown 11728 1726882189.32556: variable 'ansible_module_compression' from source: unknown 11728 1726882189.32559: variable 'ansible_shell_type' from source: unknown 11728 1726882189.32561: variable 'ansible_shell_executable' from source: unknown 11728 1726882189.32563: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882189.32567: variable 'ansible_pipelining' from source: unknown 11728 1726882189.32569: variable 'ansible_timeout' from source: unknown 11728 1726882189.32573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882189.32645: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882189.32654: variable 'omit' from source: magic vars 11728 1726882189.32659: starting attempt loop 11728 1726882189.32662: running the handler 11728 1726882189.32715: variable 'ansible_facts' from source: unknown 11728 1726882189.33436: _low_level_execute_command(): starting 11728 1726882189.33439: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882189.33982: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882189.34002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882189.34059: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882189.34062: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882189.34067: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882189.34119: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882189.35784: stdout chunk (state=3): >>>/root <<< 11728 1726882189.35883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882189.35914: stderr chunk (state=3): >>><<< 11728 1726882189.35917: stdout chunk (state=3): >>><<< 11728 1726882189.35933: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 
1726882189.35944: _low_level_execute_command(): starting 11728 1726882189.35949: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882189.3593287-12408-262375263592663 `" && echo ansible-tmp-1726882189.3593287-12408-262375263592663="` echo /root/.ansible/tmp/ansible-tmp-1726882189.3593287-12408-262375263592663 `" ) && sleep 0' 11728 1726882189.36350: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882189.36354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882189.36382: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882189.36428: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882189.36432: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882189.36446: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882189.36504: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882189.38371: stdout chunk (state=3): >>>ansible-tmp-1726882189.3593287-12408-262375263592663=/root/.ansible/tmp/ansible-tmp-1726882189.3593287-12408-262375263592663 <<< 11728 1726882189.38482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882189.38505: stderr chunk (state=3): >>><<< 11728 1726882189.38508: stdout chunk (state=3): >>><<< 11728 1726882189.38520: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882189.3593287-12408-262375263592663=/root/.ansible/tmp/ansible-tmp-1726882189.3593287-12408-262375263592663 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882189.38545: variable 'ansible_module_compression' from source: unknown 11728 1726882189.38587: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 11728 1726882189.38591: ANSIBALLZ: Acquiring lock 11728 1726882189.38596: ANSIBALLZ: Lock acquired: 139840770723472 11728 1726882189.38598: ANSIBALLZ: Creating module 11728 1726882189.68629: ANSIBALLZ: Writing module into payload 11728 1726882189.68739: ANSIBALLZ: Writing module 11728 1726882189.68759: ANSIBALLZ: Renaming module 11728 1726882189.68765: ANSIBALLZ: Done creating module 11728 1726882189.68799: variable 'ansible_facts' from source: unknown 11728 1726882189.68939: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882189.3593287-12408-262375263592663/AnsiballZ_systemd.py 11728 1726882189.69039: Sending initial data 11728 1726882189.69043: Sent initial data (156 bytes) 11728 1726882189.69463: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882189.69466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882189.69469: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 11728 1726882189.69471: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882189.69473: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882189.69517: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882189.69529: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882189.69583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882189.71274: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882189.71328: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882189.71386: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmp9qtuap7d /root/.ansible/tmp/ansible-tmp-1726882189.3593287-12408-262375263592663/AnsiballZ_systemd.py <<< 11728 1726882189.71389: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882189.3593287-12408-262375263592663/AnsiballZ_systemd.py" <<< 11728 1726882189.71432: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmp9qtuap7d" to remote "/root/.ansible/tmp/ansible-tmp-1726882189.3593287-12408-262375263592663/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882189.3593287-12408-262375263592663/AnsiballZ_systemd.py" <<< 11728 1726882189.72501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882189.72536: stderr chunk (state=3): >>><<< 11728 1726882189.72539: stdout chunk (state=3): >>><<< 11728 1726882189.72564: done transferring module to remote 11728 1726882189.72573: _low_level_execute_command(): starting 11728 1726882189.72578: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882189.3593287-12408-262375263592663/ /root/.ansible/tmp/ansible-tmp-1726882189.3593287-12408-262375263592663/AnsiballZ_systemd.py && sleep 0' 11728 1726882189.73081: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882189.73085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882189.73087: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882189.73090: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882189.73092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882189.73099: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882189.73128: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882189.73172: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882189.74886: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882189.74910: stderr chunk (state=3): >>><<< 11728 1726882189.74914: stdout chunk (state=3): >>><<< 11728 1726882189.74925: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882189.74928: _low_level_execute_command(): starting 11728 1726882189.74934: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882189.3593287-12408-262375263592663/AnsiballZ_systemd.py && sleep 0' 11728 1726882189.75347: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882189.75351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882189.75353: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882189.75355: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882189.75357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882189.75401: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882189.75414: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882189.75468: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882190.03824: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "711", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", 
"NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainStartTimestampMonotonic": "33869352", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainHandoffTimestampMonotonic": "33887880", "ExecMainPID": "711", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10305536", "MemoryPeak": "13234176", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3299307520", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "572356000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 11728 1726882190.04132: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", 
"LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": 
"NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target NetworkManager-wait-online.service shutdown.target cloud-init.service multi-user.target", "After": "system.slice network-pre.target dbus.socket systemd-journald.socket sysinit.target cloud-init-local.service dbus-broker.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:09 EDT", "StateChangeTimestampMonotonic": "456380513", "InactiveExitTimestamp": "Fri 2024-09-20 21:22:06 EDT", "InactiveExitTimestampMonotonic": "33869684", "ActiveEnterTimestamp": "Fri 2024-09-20 21:22:07 EDT", "ActiveEnterTimestampMonotonic": "34618487", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ConditionTimestampMonotonic": "33868497", "AssertTimestamp": "Fri 2024-09-20 21:22:06 EDT", "AssertTimestampMonotonic": "33868500", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4af90bef47894571b134cea51cfa03e2", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11728 1726882190.05635: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882190.05651: stderr chunk (state=3): >>>Shared connection to 10.31.10.229 closed. 
<<< 11728 1726882190.05716: stderr chunk (state=3): >>><<< 11728 1726882190.05720: stdout chunk (state=3): >>><<< 11728 1726882190.05744: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "711", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainStartTimestampMonotonic": "33869352", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainHandoffTimestampMonotonic": "33887880", "ExecMainPID": "711", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10305536", "MemoryPeak": "13234176", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3299307520", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "572356000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target NetworkManager-wait-online.service shutdown.target cloud-init.service multi-user.target", "After": "system.slice network-pre.target dbus.socket systemd-journald.socket sysinit.target cloud-init-local.service dbus-broker.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:09 EDT", "StateChangeTimestampMonotonic": "456380513", "InactiveExitTimestamp": "Fri 2024-09-20 21:22:06 EDT", "InactiveExitTimestampMonotonic": "33869684", "ActiveEnterTimestamp": "Fri 2024-09-20 21:22:07 EDT", "ActiveEnterTimestampMonotonic": "34618487", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ConditionTimestampMonotonic": "33868497", "AssertTimestamp": "Fri 2024-09-20 21:22:06 EDT", "AssertTimestampMonotonic": "33868500", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4af90bef47894571b134cea51cfa03e2", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 11728 1726882190.05956: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882189.3593287-12408-262375263592663/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882190.06007: _low_level_execute_command(): starting 11728 1726882190.06011: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882189.3593287-12408-262375263592663/ > /dev/null 2>&1 && sleep 0' 11728 1726882190.07399: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882190.07403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882190.07406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882190.07409: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882190.07411: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882190.07413: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882190.07681: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882190.07747: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882190.09731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882190.09734: stdout chunk (state=3): >>><<< 11728 1726882190.09737: stderr chunk (state=3): >>><<< 11728 1726882190.09739: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882190.09741: handler run complete 11728 1726882190.09743: attempt loop complete, returning result 11728 1726882190.09745: _execute() done 11728 1726882190.09746: dumping result to json 11728 1726882190.09748: done dumping result, returning 11728 1726882190.09750: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-5c28-a762-000000000283] 11728 1726882190.09751: sending task result for task 12673a56-9f93-5c28-a762-000000000283 11728 1726882190.10761: done sending task result for task 12673a56-9f93-5c28-a762-000000000283 11728 1726882190.10764: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11728 1726882190.10817: no more pending results, returning what we have 11728 1726882190.10821: results queue empty 11728 1726882190.10822: checking for any_errors_fatal 11728 1726882190.10829: done checking for any_errors_fatal 11728 1726882190.10830: checking for max_fail_percentage 11728 1726882190.10831: done checking for max_fail_percentage 11728 1726882190.10832: checking to see if all hosts have failed and the running result is not ok 11728 1726882190.10833: done checking to see if all hosts have failed 11728 1726882190.10833: getting the remaining hosts for this loop 11728 1726882190.10835: done getting the remaining hosts for this loop 11728 1726882190.10839: getting the next task for host managed_node3 11728 1726882190.10846: done getting next task for host managed_node3 11728 1726882190.10850: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11728 1726882190.10854: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882190.10864: getting variables 11728 1726882190.10866: in VariableManager get_vars() 11728 1726882190.11020: Calling all_inventory to load vars for managed_node3 11728 1726882190.11023: Calling groups_inventory to load vars for managed_node3 11728 1726882190.11026: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882190.11035: Calling all_plugins_play to load vars for managed_node3 11728 1726882190.11038: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882190.11040: Calling groups_plugins_play to load vars for managed_node3 11728 1726882190.12624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882190.14295: done with get_vars() 11728 1726882190.14398: done getting variables 11728 1726882190.14455: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:29:50 -0400 (0:00:00.922) 0:00:14.998 ****** 11728 1726882190.14608: entering _queue_task() for managed_node3/service 11728 1726882190.15181: worker is 1 (out of 1 available) 11728 1726882190.15197: exiting _queue_task() for managed_node3/service 11728 1726882190.15211: done queuing things up, now waiting for results queue to drain 11728 1726882190.15213: waiting for pending results... 
11728 1726882190.15624: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11728 1726882190.15790: in run() - task 12673a56-9f93-5c28-a762-000000000284 11728 1726882190.15796: variable 'ansible_search_path' from source: unknown 11728 1726882190.15799: variable 'ansible_search_path' from source: unknown 11728 1726882190.15900: calling self._execute() 11728 1726882190.15937: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882190.15948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882190.15961: variable 'omit' from source: magic vars 11728 1726882190.16373: variable 'ansible_distribution_major_version' from source: facts 11728 1726882190.16390: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882190.16512: variable 'network_provider' from source: set_fact 11728 1726882190.16524: Evaluated conditional (network_provider == "nm"): True 11728 1726882190.16626: variable '__network_wpa_supplicant_required' from source: role '' defaults 11728 1726882190.16717: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11728 1726882190.16987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882190.19929: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882190.20007: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882190.20055: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882190.20102: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882190.20132: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882190.20215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882190.20254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882190.20356: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882190.20359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882190.20362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882190.20406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882190.20432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 11728 1726882190.20456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882190.20502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882190.20520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882190.20572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882190.20592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882190.20682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882190.20685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882190.20688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882190.20854: variable 'network_connections' from source: include params 11728 1726882190.20870: variable 'controller_profile' from source: play vars 11728 1726882190.20951: variable 'controller_profile' from source: play vars 11728 1726882190.20966: variable 'controller_device' from source: play vars 11728 1726882190.21035: variable 'controller_device' from source: play vars 11728 1726882190.21057: variable 'port1_profile' from source: play vars 11728 1726882190.21126: variable 'port1_profile' from source: play vars 11728 1726882190.21138: variable 'dhcp_interface1' from source: play vars 11728 1726882190.21207: variable 'dhcp_interface1' from source: play vars 11728 1726882190.21260: variable 'controller_profile' from source: play vars 11728 1726882190.21291: variable 'controller_profile' from source: play vars 11728 1726882190.21306: variable 'port2_profile' from source: play vars 11728 1726882190.21375: variable 'port2_profile' from source: play vars 11728 1726882190.21388: variable 'dhcp_interface2' from source: play vars 11728 1726882190.21457: variable 'dhcp_interface2' from source: play vars 11728 1726882190.21485: variable 'controller_profile' from source: play vars 11728 1726882190.21556: variable 'controller_profile' from source: play vars 11728 1726882190.21661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882190.21813: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882190.21853: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882190.21890: Loading TestModule 'mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882190.21929: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882190.21987: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882190.22198: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882190.22201: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882190.22212: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882190.22214: variable '__network_wireless_connections_defined' from source: role '' defaults 11728 1726882190.22364: variable 'network_connections' from source: include params 11728 1726882190.22374: variable 'controller_profile' from source: play vars 11728 1726882190.22444: variable 'controller_profile' from source: play vars 11728 1726882190.22455: variable 'controller_device' from source: play vars 11728 1726882190.22517: variable 'controller_device' from source: play vars 11728 1726882190.22532: variable 'port1_profile' from source: play vars 11728 1726882190.22601: variable 'port1_profile' from source: play vars 11728 1726882190.22612: variable 'dhcp_interface1' from source: play vars 11728 1726882190.22679: variable 'dhcp_interface1' from source: play vars 11728 1726882190.22690: variable 'controller_profile' from source: play vars 11728 1726882190.22752: variable 'controller_profile' from source: play vars 11728 1726882190.22771: variable 'port2_profile' from source: play vars 11728 1726882190.22834: variable 'port2_profile' from source: play vars 11728 1726882190.22845: variable 'dhcp_interface2' from source: play vars 11728 1726882190.22914: variable 'dhcp_interface2' from source: play vars 11728 1726882190.22925: variable 'controller_profile' from source: play vars 11728 1726882190.23095: variable 'controller_profile' from source: play vars 11728 1726882190.23099: Evaluated conditional (__network_wpa_supplicant_required): False 11728 1726882190.23101: when evaluation is False, skipping this task 11728 1726882190.23103: _execute() done 11728 1726882190.23105: dumping result to json 11728 1726882190.23107: done dumping result, returning 11728 1726882190.23109: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-5c28-a762-000000000284] 11728 1726882190.23111: sending task result for task 12673a56-9f93-5c28-a762-000000000284 11728 1726882190.23182: done sending task result for task 12673a56-9f93-5c28-a762-000000000284 11728 1726882190.23186: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 11728 1726882190.23246: no more pending results, returning what we have 11728 1726882190.23250: results queue empty 11728 1726882190.23252: checking for any_errors_fatal 11728 1726882190.23276: done checking for any_errors_fatal 11728 
1726882190.23277: checking for max_fail_percentage 11728 1726882190.23279: done checking for max_fail_percentage 11728 1726882190.23280: checking to see if all hosts have failed and the running result is not ok 11728 1726882190.23281: done checking to see if all hosts have failed 11728 1726882190.23282: getting the remaining hosts for this loop 11728 1726882190.23283: done getting the remaining hosts for this loop 11728 1726882190.23287: getting the next task for host managed_node3 11728 1726882190.23297: done getting next task for host managed_node3 11728 1726882190.23301: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 11728 1726882190.23310: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882190.23325: getting variables 11728 1726882190.23327: in VariableManager get_vars() 11728 1726882190.23364: Calling all_inventory to load vars for managed_node3 11728 1726882190.23367: Calling groups_inventory to load vars for managed_node3 11728 1726882190.23370: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882190.23381: Calling all_plugins_play to load vars for managed_node3 11728 1726882190.23384: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882190.23388: Calling groups_plugins_play to load vars for managed_node3 11728 1726882190.25398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882190.27085: done with get_vars() 11728 1726882190.27112: done getting variables 11728 1726882190.27167: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:29:50 -0400 (0:00:00.125) 0:00:15.124 ****** 11728 1726882190.27200: entering _queue_task() for managed_node3/service 11728 1726882190.27504: worker is 1 (out of 1 available) 11728 1726882190.27515: exiting _queue_task() for managed_node3/service 11728 1726882190.27526: done queuing things up, now waiting for results queue to drain 11728 1726882190.27527: waiting for pending results... 11728 1726882190.27823: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 11728 1726882190.27938: in run() - task 12673a56-9f93-5c28-a762-000000000285 11728 1726882190.27957: variable 'ansible_search_path' from source: unknown 11728 1726882190.27965: variable 'ansible_search_path' from source: unknown 11728 1726882190.28006: calling self._execute() 11728 1726882190.28103: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882190.28115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882190.28135: variable 'omit' from source: magic vars 11728 1726882190.28572: variable 'ansible_distribution_major_version' from source: facts 11728 1726882190.28576: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882190.28649: variable 'network_provider' from source: set_fact 11728 1726882190.28660: Evaluated conditional (network_provider == "initscripts"): False 11728 1726882190.28667: when evaluation is False, skipping this task 11728 1726882190.28681: _execute() done 11728 1726882190.28688: dumping result to json 11728 1726882190.28698: done dumping result, returning 11728 1726882190.28711: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-5c28-a762-000000000285] 11728 1726882190.28721: sending task result for task 12673a56-9f93-5c28-a762-000000000285 11728 1726882190.28934: done sending task result for task 12673a56-9f93-5c28-a762-000000000285 11728 1726882190.28937: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11728 
1726882190.28982: no more pending results, returning what we have 11728 1726882190.28986: results queue empty 11728 1726882190.28987: checking for any_errors_fatal 11728 1726882190.29001: done checking for any_errors_fatal 11728 1726882190.29006: checking for max_fail_percentage 11728 1726882190.29008: done checking for max_fail_percentage 11728 1726882190.29009: checking to see if all hosts have failed and the running result is not ok 11728 1726882190.29010: done checking to see if all hosts have failed 11728 1726882190.29010: getting the remaining hosts for this loop 11728 1726882190.29012: done getting the remaining hosts for this loop 11728 1726882190.29016: getting the next task for host managed_node3 11728 1726882190.29023: done getting next task for host managed_node3 11728 1726882190.29028: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11728 1726882190.29033: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882190.29049: getting variables 11728 1726882190.29051: in VariableManager get_vars() 11728 1726882190.29086: Calling all_inventory to load vars for managed_node3 11728 1726882190.29090: Calling groups_inventory to load vars for managed_node3 11728 1726882190.29092: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882190.29243: Calling all_plugins_play to load vars for managed_node3 11728 1726882190.29246: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882190.29250: Calling groups_plugins_play to load vars for managed_node3 11728 1726882190.30636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882190.32571: done with get_vars() 11728 1726882190.32592: done getting variables 11728 1726882190.32654: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:29:50 -0400 (0:00:00.054) 0:00:15.179 ****** 11728 1726882190.32687: entering _queue_task() for managed_node3/copy 11728 1726882190.33196: worker is 1 (out of 1 available) 11728 1726882190.33205: exiting _queue_task() for managed_node3/copy 11728 1726882190.33214: done queuing things up, now waiting for results queue to drain 11728 1726882190.33215: waiting for pending results... 11728 1726882190.33267: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11728 1726882190.33442: in run() - task 12673a56-9f93-5c28-a762-000000000286 11728 1726882190.33445: variable 'ansible_search_path' from source: unknown 11728 1726882190.33448: variable 'ansible_search_path' from source: unknown 11728 1726882190.33464: calling self._execute() 11728 1726882190.33559: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882190.33569: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882190.33582: variable 'omit' from source: magic vars 11728 1726882190.33986: variable 'ansible_distribution_major_version' from source: facts 11728 1726882190.33989: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882190.34083: variable 'network_provider' from source: set_fact 11728 1726882190.34203: Evaluated conditional (network_provider == "initscripts"): False 11728 1726882190.34206: when evaluation is False, skipping this task 11728 1726882190.34208: _execute() done 11728 1726882190.34211: dumping result to json 11728 1726882190.34213: done dumping result, returning 11728 1726882190.34216: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-5c28-a762-000000000286] 11728 1726882190.34218: sending task result for task 12673a56-9f93-5c28-a762-000000000286 11728 1726882190.34288: done sending task result for task 12673a56-9f93-5c28-a762-000000000286 11728 1726882190.34292: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == 
\"initscripts\"", "skip_reason": "Conditional result was False" } 11728 1726882190.34350: no more pending results, returning what we have 11728 1726882190.34354: results queue empty 11728 1726882190.34355: checking for any_errors_fatal 11728 1726882190.34361: done checking for any_errors_fatal 11728 1726882190.34362: checking for max_fail_percentage 11728 1726882190.34364: done checking for max_fail_percentage 11728 1726882190.34364: checking to see if all hosts have failed and the running result is not ok 11728 1726882190.34365: done checking to see if all hosts have failed 11728 1726882190.34366: getting the remaining hosts for this loop 11728 1726882190.34367: done getting the remaining hosts for this loop 11728 1726882190.34371: getting the next task for host managed_node3 11728 1726882190.34379: done getting next task for host managed_node3 11728 1726882190.34382: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11728 1726882190.34388: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882190.34404: getting variables 11728 1726882190.34405: in VariableManager get_vars() 11728 1726882190.34532: Calling all_inventory to load vars for managed_node3 11728 1726882190.34535: Calling groups_inventory to load vars for managed_node3 11728 1726882190.34538: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882190.34548: Calling all_plugins_play to load vars for managed_node3 11728 1726882190.34551: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882190.34555: Calling groups_plugins_play to load vars for managed_node3 11728 1726882190.36097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882190.37713: done with get_vars() 11728 1726882190.37733: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:29:50 -0400 (0:00:00.051) 0:00:15.230 ****** 11728 1726882190.37818: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 11728 1726882190.37819: Creating lock for fedora.linux_system_roles.network_connections 11728 1726882190.38085: worker is 1 (out of 1 available) 11728 1726882190.38099: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 11728 1726882190.38223: done queuing things up, now waiting for results queue to drain 11728 1726882190.38225: waiting for pending results... 11728 1726882190.38510: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11728 1726882190.38539: in run() - task 12673a56-9f93-5c28-a762-000000000287 11728 1726882190.38608: variable 'ansible_search_path' from source: unknown 11728 1726882190.38612: variable 'ansible_search_path' from source: unknown 11728 1726882190.38614: calling self._execute() 11728 1726882190.38703: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882190.38720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882190.38734: variable 'omit' from source: magic vars 11728 1726882190.39112: variable 'ansible_distribution_major_version' from source: facts 11728 1726882190.39130: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882190.39150: variable 'omit' from source: magic vars 11728 1726882190.39259: variable 'omit' from source: magic vars 11728 1726882190.39386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882190.41691: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882190.41777: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882190.41821: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882190.41871: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882190.41906: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882190.42054: variable 'network_provider' from source: set_fact 11728 1726882190.42134: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882190.42176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882190.42208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882190.42251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882190.42276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882190.42354: variable 'omit' from source: magic vars 11728 1726882190.42469: variable 'omit' from source: magic vars 11728 1726882190.42601: variable 'network_connections' from source: include params 11728 1726882190.42618: variable 'controller_profile' from source: play vars 11728 1726882190.42708: variable 'controller_profile' from source: play vars 11728 1726882190.42711: variable 'controller_device' from source: play vars 11728 1726882190.42761: variable 'controller_device' from source: play vars 11728 1726882190.42778: variable 'port1_profile' from source: play vars 11728 1726882190.42847: variable 'port1_profile' from source: play vars 11728 1726882190.42898: variable 'dhcp_interface1' from source: play vars 11728 1726882190.42930: variable 'dhcp_interface1' from source: play vars 11728 1726882190.42941: variable 'controller_profile' from source: play vars 11728 1726882190.43002: variable 'controller_profile' from source: play vars 11728 1726882190.43014: variable 'port2_profile' from source: play vars 11728 1726882190.43081: variable 'port2_profile' from source: play vars 11728 1726882190.43095: variable 'dhcp_interface2' from source: play vars 11728 1726882190.43164: variable 'dhcp_interface2' from source: play vars 11728 1726882190.43298: variable 'controller_profile' from source: play vars 11728 1726882190.43301: variable 'controller_profile' from source: play vars 11728 1726882190.43465: variable 'omit' from source: magic vars 11728 1726882190.43478: variable '__lsr_ansible_managed' from source: task vars 11728 1726882190.43563: variable '__lsr_ansible_managed' from source: task vars 11728 1726882190.43756: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 11728 1726882190.44799: Loaded config def from plugin (lookup/template) 11728 1726882190.44802: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 11728 1726882190.44998: File lookup term: get_ansible_managed.j2 11728 1726882190.45001: variable 'ansible_search_path' from source: unknown 11728 1726882190.45003: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 11728 1726882190.45007: search_path: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 11728 1726882190.45009: variable 'ansible_search_path' from source: unknown 11728 1726882190.55713: variable 'ansible_managed' from source: unknown 11728 1726882190.55799: variable 'omit' from source: magic vars 11728 1726882190.55819: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882190.55838: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882190.55855: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882190.55867: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882190.55876: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882190.55900: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882190.55904: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882190.55906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882190.55964: Set connection var ansible_connection to ssh 11728 1726882190.55971: Set connection var ansible_shell_executable to /bin/sh 11728 1726882190.55977: Set connection var ansible_timeout to 10 11728 1726882190.55980: Set connection var ansible_shell_type to sh 11728 1726882190.55986: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882190.56002: Set connection var ansible_pipelining to False 11728 1726882190.56014: variable 'ansible_shell_executable' from source: unknown 11728 1726882190.56018: variable 'ansible_connection' from source: unknown 11728 1726882190.56021: variable 'ansible_module_compression' from source: unknown 11728 1726882190.56023: variable 'ansible_shell_type' from source: unknown 11728 1726882190.56025: variable 'ansible_shell_executable' from source: unknown 11728 1726882190.56027: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882190.56032: variable 'ansible_pipelining' from source: unknown 11728 1726882190.56034: variable 'ansible_timeout' from source: unknown 11728 1726882190.56038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882190.56135: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
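The "Set connection var" lines above show the effective connection settings for this task: SSH with /bin/sh as the shell, a 10-second timeout, ZIP_DEFLATED module compression, and pipelining disabled. A hedged sketch of host/group variables that would yield the same settings; the values are copied from the trace, but whether they were explicitly set or merely defaulted is not visible here:

ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_timeout: 10
ansible_pipelining: false
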
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11728 1726882190.56143: variable 'omit' from source: magic vars 11728 1726882190.56150: starting attempt loop 11728 1726882190.56153: running the handler 11728 1726882190.56164: _low_level_execute_command(): starting 11728 1726882190.56169: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882190.56774: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882190.56825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882190.56851: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882190.56938: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882190.58599: stdout chunk (state=3): >>>/root <<< 11728 1726882190.58713: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882190.58729: stderr chunk (state=3): >>><<< 11728 1726882190.58732: stdout chunk (state=3): >>><<< 11728 1726882190.58751: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882190.58761: _low_level_execute_command(): starting 11728 1726882190.58766: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882190.5875068-12465-69439687270821 `" && echo ansible-tmp-1726882190.5875068-12465-69439687270821="` echo /root/.ansible/tmp/ansible-tmp-1726882190.5875068-12465-69439687270821 `" ) && sleep 0' 11728 1726882190.59169: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882190.59173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882190.59175: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 11728 1726882190.59177: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882190.59179: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882190.59224: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882190.59230: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882190.59280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882190.61123: stdout chunk (state=3): >>>ansible-tmp-1726882190.5875068-12465-69439687270821=/root/.ansible/tmp/ansible-tmp-1726882190.5875068-12465-69439687270821 <<< 11728 1726882190.61229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882190.61254: stderr chunk (state=3): >>><<< 11728 1726882190.61256: stdout chunk (state=3): >>><<< 11728 1726882190.61266: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882190.5875068-12465-69439687270821=/root/.ansible/tmp/ansible-tmp-1726882190.5875068-12465-69439687270821 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
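Each module run first creates a per-task temporary directory under the remote user's ~/.ansible/tmp, which is what the umask/mkdir command above does. That location is governed by the shell plugin's remote_tmp option; a hedged one-line sketch of the corresponding host variable, matching the _ansible_remote_tmp value echoed later in this trace:

ansible_remote_tmp: "~/.ansible/tmp"   # default shown in the trace; point elsewhere to relocate the ansible-tmp-* dirs
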
master 0 11728 1726882190.61399: variable 'ansible_module_compression' from source: unknown 11728 1726882190.61408: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 11728 1726882190.61411: ANSIBALLZ: Acquiring lock 11728 1726882190.61413: ANSIBALLZ: Lock acquired: 139840766669648 11728 1726882190.61415: ANSIBALLZ: Creating module 11728 1726882190.76934: ANSIBALLZ: Writing module into payload 11728 1726882190.77156: ANSIBALLZ: Writing module 11728 1726882190.77174: ANSIBALLZ: Renaming module 11728 1726882190.77180: ANSIBALLZ: Done creating module 11728 1726882190.77204: variable 'ansible_facts' from source: unknown 11728 1726882190.77269: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882190.5875068-12465-69439687270821/AnsiballZ_network_connections.py 11728 1726882190.77371: Sending initial data 11728 1726882190.77374: Sent initial data (167 bytes) 11728 1726882190.77832: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882190.77835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882190.77837: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882190.77839: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 11728 1726882190.77841: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882190.77843: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882190.77881: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882190.77899: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882190.77962: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882190.79595: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 11728 1726882190.79599: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882190.79631: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
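Because pipelining is disabled (see the connection variables set earlier), the network_connections module is packaged as a self-contained AnsiballZ_*.py payload and written into the temporary directory over SFTP, which is what the "transferring module to remote" and sftp put lines record. A hedged sketch of the variable that would instead pipe the payload to the remote interpreter over the existing SSH session, skipping the file transfer; whether pipelining is safe here depends on become/requiretty settings the trace does not show:

ansible_pipelining: true
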
<<< 11728 1726882190.79679: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpjqbsj34f /root/.ansible/tmp/ansible-tmp-1726882190.5875068-12465-69439687270821/AnsiballZ_network_connections.py <<< 11728 1726882190.79682: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882190.5875068-12465-69439687270821/AnsiballZ_network_connections.py" <<< 11728 1726882190.79725: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpjqbsj34f" to remote "/root/.ansible/tmp/ansible-tmp-1726882190.5875068-12465-69439687270821/AnsiballZ_network_connections.py" <<< 11728 1726882190.79727: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882190.5875068-12465-69439687270821/AnsiballZ_network_connections.py" <<< 11728 1726882190.80448: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882190.80496: stderr chunk (state=3): >>><<< 11728 1726882190.80500: stdout chunk (state=3): >>><<< 11728 1726882190.80518: done transferring module to remote 11728 1726882190.80527: _low_level_execute_command(): starting 11728 1726882190.80532: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882190.5875068-12465-69439687270821/ /root/.ansible/tmp/ansible-tmp-1726882190.5875068-12465-69439687270821/AnsiballZ_network_connections.py && sleep 0' 11728 1726882190.80956: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882190.80964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882190.80986: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882190.80989: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882190.80991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882190.81051: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882190.81054: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882190.81058: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882190.81105: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882190.82847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882190.82865: stderr chunk (state=3): >>><<< 11728 1726882190.82868: stdout chunk (state=3): >>><<< 11728 1726882190.82880: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882190.82883: _low_level_execute_command(): starting 11728 1726882190.82890: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882190.5875068-12465-69439687270821/AnsiballZ_network_connections.py && sleep 0' 11728 1726882190.83440: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882190.83498: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882190.83502: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882190.83522: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882190.83557: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882191.23766: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a9f108ff-93a0-4692-a961-7fb7246e6129\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 7846a1e6-a3d1-419f-996a-824f28a6a5c0\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 595a8ae5-0b4e-4403-aa9d-4e632858ba4c\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a9f108ff-93a0-4692-a961-7fb7246e6129 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 7846a1e6-a3d1-419f-996a-824f28a6a5c0 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 
595a8ae5-0b4e-4403-aa9d-4e632858ba4c (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "802.3ad", "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "802.3ad", "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 11728 1726882191.25887: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
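The module arguments echoed in the result above map directly onto the role's network_connections input. A hedged reconstruction of that variable in playbook YAML, with the literal values taken from the trace; in the test play these were fed through play vars such as controller_profile, controller_device, dhcp_interface1 and dhcp_interface2 rather than hard-coded:

network_connections:
  - name: bond0
    state: up
    type: bond
    interface_name: nm-bond
    bond:
      mode: 802.3ad
      ad_actor_sys_prio: 65535
      ad_actor_system: "00:00:5e:00:53:5d"
      ad_select: stable
      ad_user_port_key: 1023
      all_ports_active: true
      downdelay: 0
      lacp_rate: slow
      lp_interval: 128
      miimon: 110
      min_links: 0
      num_grat_arp: 64
      primary_reselect: better
      resend_igmp: 225
      updelay: 0
      use_carrier: true
      xmit_hash_policy: encap2+3
    ip:
      route_metric4: 65535
  - name: bond0.0
    state: up
    type: ethernet
    interface_name: test1
    controller: bond0
  - name: bond0.1
    state: up
    type: ethernet
    interface_name: test2
    controller: bond0
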
<<< 11728 1726882191.25891: stdout chunk (state=3): >>><<< 11728 1726882191.25900: stderr chunk (state=3): >>><<< 11728 1726882191.25917: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a9f108ff-93a0-4692-a961-7fb7246e6129\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 7846a1e6-a3d1-419f-996a-824f28a6a5c0\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 595a8ae5-0b4e-4403-aa9d-4e632858ba4c\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a9f108ff-93a0-4692-a961-7fb7246e6129 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 7846a1e6-a3d1-419f-996a-824f28a6a5c0 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 595a8ae5-0b4e-4403-aa9d-4e632858ba4c (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "802.3ad", "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "802.3ad", "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 11728 1726882191.25974: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': '802.3ad', 'ad_actor_sys_prio': 65535, 'ad_actor_system': '00:00:5e:00:53:5d', 'ad_select': 'stable', 'ad_user_port_key': 1023, 'all_ports_active': True, 'downdelay': 0, 'lacp_rate': 'slow', 'lp_interval': 128, 'miimon': 110, 'min_links': 0, 'num_grat_arp': 64, 'primary_reselect': 'better', 'resend_igmp': 225, 'updelay': 0, 'use_carrier': True, 'xmit_hash_policy': 'encap2+3'}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882190.5875068-12465-69439687270821/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882191.25981: _low_level_execute_command(): starting 11728 1726882191.25986: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882190.5875068-12465-69439687270821/ > /dev/null 2>&1 && sleep 0' 11728 1726882191.26417: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882191.26421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882191.26424: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882191.26426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882191.26481: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882191.26488: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882191.26491: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882191.26532: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882191.28368: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882191.28389: stderr chunk (state=3): >>><<< 11728 1726882191.28396: stdout chunk (state=3): >>><<< 11728 1726882191.28412: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882191.28415: handler run complete 11728 1726882191.28453: attempt loop complete, returning result 11728 1726882191.28456: _execute() done 11728 1726882191.28458: dumping result to json 11728 1726882191.28465: done dumping result, returning 11728 1726882191.28473: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-5c28-a762-000000000287] 11728 1726882191.28477: sending task result for task 12673a56-9f93-5c28-a762-000000000287 11728 1726882191.28600: done sending task result for task 12673a56-9f93-5c28-a762-000000000287 11728 1726882191.28603: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "mode": "802.3ad", "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] 
#0, state:up persistent_state:present, 'bond0': add connection bond0, a9f108ff-93a0-4692-a961-7fb7246e6129 [008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 7846a1e6-a3d1-419f-996a-824f28a6a5c0 [009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 595a8ae5-0b4e-4403-aa9d-4e632858ba4c [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a9f108ff-93a0-4692-a961-7fb7246e6129 (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 7846a1e6-a3d1-419f-996a-824f28a6a5c0 (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 595a8ae5-0b4e-4403-aa9d-4e632858ba4c (not-active) 11728 1726882191.28754: no more pending results, returning what we have 11728 1726882191.28757: results queue empty 11728 1726882191.28758: checking for any_errors_fatal 11728 1726882191.28764: done checking for any_errors_fatal 11728 1726882191.28765: checking for max_fail_percentage 11728 1726882191.28766: done checking for max_fail_percentage 11728 1726882191.28767: checking to see if all hosts have failed and the running result is not ok 11728 1726882191.28768: done checking to see if all hosts have failed 11728 1726882191.28768: getting the remaining hosts for this loop 11728 1726882191.28770: done getting the remaining hosts for this loop 11728 1726882191.28773: getting the next task for host managed_node3 11728 1726882191.28780: done getting next task for host managed_node3 11728 1726882191.28783: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 11728 1726882191.28786: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882191.28804: getting variables 11728 1726882191.28806: in VariableManager get_vars() 11728 1726882191.28840: Calling all_inventory to load vars for managed_node3 11728 1726882191.28843: Calling groups_inventory to load vars for managed_node3 11728 1726882191.28845: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882191.28854: Calling all_plugins_play to load vars for managed_node3 11728 1726882191.28856: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882191.28858: Calling groups_plugins_play to load vars for managed_node3 11728 1726882191.29762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882191.30609: done with get_vars() 11728 1726882191.30625: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:29:51 -0400 (0:00:00.928) 0:00:16.159 ****** 11728 1726882191.30687: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 11728 1726882191.30688: Creating lock for fedora.linux_system_roles.network_state 11728 1726882191.30920: worker is 1 (out of 1 available) 11728 1726882191.30935: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 11728 1726882191.30947: done queuing things up, now waiting for results queue to drain 11728 1726882191.30949: waiting for pending results... 11728 1726882191.31118: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 11728 1726882191.31206: in run() - task 12673a56-9f93-5c28-a762-000000000288 11728 1726882191.31218: variable 'ansible_search_path' from source: unknown 11728 1726882191.31222: variable 'ansible_search_path' from source: unknown 11728 1726882191.31255: calling self._execute() 11728 1726882191.31339: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882191.31343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882191.31352: variable 'omit' from source: magic vars 11728 1726882191.31617: variable 'ansible_distribution_major_version' from source: facts 11728 1726882191.31626: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882191.31711: variable 'network_state' from source: role '' defaults 11728 1726882191.31719: Evaluated conditional (network_state != {}): False 11728 1726882191.31722: when evaluation is False, skipping this task 11728 1726882191.31725: _execute() done 11728 1726882191.31729: dumping result to json 11728 1726882191.31731: done dumping result, returning 11728 1726882191.31736: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-5c28-a762-000000000288] 11728 1726882191.31741: sending task result for task 12673a56-9f93-5c28-a762-000000000288 11728 1726882191.31827: done sending task result for task 12673a56-9f93-5c28-a762-000000000288 11728 1726882191.31830: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11728 1726882191.31889: no more pending results, returning what we have 11728 1726882191.31896: results queue empty 11728 1726882191.31898: checking for any_errors_fatal 11728 1726882191.31907: done checking for 
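The "Configure networking state" task above is skipped because network_state is still the role default, an empty mapping, so the conditional network_state != {} evaluates to False. A hedged sketch of the relevant variable; only the name, the empty default, and the condition come from the trace:

network_state: {}   # role default; leaving it empty keeps the task skipped
# a play wanting declarative (nmstate-style) configuration would set a non-empty mapping here instead
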
any_errors_fatal 11728 1726882191.31908: checking for max_fail_percentage 11728 1726882191.31909: done checking for max_fail_percentage 11728 1726882191.31910: checking to see if all hosts have failed and the running result is not ok 11728 1726882191.31911: done checking to see if all hosts have failed 11728 1726882191.31911: getting the remaining hosts for this loop 11728 1726882191.31913: done getting the remaining hosts for this loop 11728 1726882191.31916: getting the next task for host managed_node3 11728 1726882191.31922: done getting next task for host managed_node3 11728 1726882191.31925: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11728 1726882191.31930: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882191.31942: getting variables 11728 1726882191.31943: in VariableManager get_vars() 11728 1726882191.31970: Calling all_inventory to load vars for managed_node3 11728 1726882191.31972: Calling groups_inventory to load vars for managed_node3 11728 1726882191.31974: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882191.31982: Calling all_plugins_play to load vars for managed_node3 11728 1726882191.31984: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882191.31987: Calling groups_plugins_play to load vars for managed_node3 11728 1726882191.32712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882191.34131: done with get_vars() 11728 1726882191.34145: done getting variables 11728 1726882191.34187: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:29:51 -0400 (0:00:00.035) 0:00:16.194 ****** 11728 1726882191.34214: entering _queue_task() for managed_node3/debug 11728 1726882191.34404: worker is 1 (out of 1 available) 11728 1726882191.34416: exiting _queue_task() for managed_node3/debug 11728 1726882191.34429: done queuing things up, now waiting for results queue to drain 11728 1726882191.34430: waiting for pending results... 11728 1726882191.34589: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11728 1726882191.34662: in run() - task 12673a56-9f93-5c28-a762-000000000289 11728 1726882191.34674: variable 'ansible_search_path' from source: unknown 11728 1726882191.34677: variable 'ansible_search_path' from source: unknown 11728 1726882191.34707: calling self._execute() 11728 1726882191.34773: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882191.34778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882191.34787: variable 'omit' from source: magic vars 11728 1726882191.35038: variable 'ansible_distribution_major_version' from source: facts 11728 1726882191.35047: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882191.35053: variable 'omit' from source: magic vars 11728 1726882191.35101: variable 'omit' from source: magic vars 11728 1726882191.35120: variable 'omit' from source: magic vars 11728 1726882191.35149: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882191.35174: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882191.35190: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882191.35209: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882191.35218: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882191.35242: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882191.35245: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882191.35248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882191.35314: Set connection var ansible_connection to ssh 11728 1726882191.35322: Set connection var ansible_shell_executable to /bin/sh 11728 1726882191.35327: Set connection var ansible_timeout to 10 11728 1726882191.35330: Set connection var ansible_shell_type to sh 11728 1726882191.35336: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882191.35341: Set connection var ansible_pipelining to False 11728 1726882191.35358: variable 'ansible_shell_executable' from source: unknown 11728 1726882191.35361: variable 'ansible_connection' from source: unknown 11728 1726882191.35364: variable 'ansible_module_compression' from source: unknown 11728 1726882191.35366: variable 'ansible_shell_type' from source: unknown 11728 1726882191.35368: variable 'ansible_shell_executable' from source: unknown 11728 1726882191.35370: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882191.35373: variable 'ansible_pipelining' from source: unknown 11728 1726882191.35375: variable 'ansible_timeout' from source: unknown 11728 1726882191.35380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882191.35478: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882191.35486: variable 'omit' from source: magic vars 11728 1726882191.35491: starting attempt loop 11728 1726882191.35498: running the handler 11728 1726882191.35797: variable '__network_connections_result' from source: set_fact 11728 1726882191.35800: handler run complete 11728 1726882191.35803: attempt loop complete, returning result 11728 1726882191.35805: _execute() done 11728 1726882191.35807: dumping result to json 11728 1726882191.35809: done dumping result, returning 11728 1726882191.35812: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-5c28-a762-000000000289] 11728 1726882191.35813: sending task result for task 12673a56-9f93-5c28-a762-000000000289 11728 1726882191.35872: done sending task result for task 12673a56-9f93-5c28-a762-000000000289 11728 1726882191.35875: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a9f108ff-93a0-4692-a961-7fb7246e6129", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 7846a1e6-a3d1-419f-996a-824f28a6a5c0", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 595a8ae5-0b4e-4403-aa9d-4e632858ba4c", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a9f108ff-93a0-4692-a961-7fb7246e6129 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 7846a1e6-a3d1-419f-996a-824f28a6a5c0 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 595a8ae5-0b4e-4403-aa9d-4e632858ba4c (not-active)" ] } 11728 1726882191.36024: no more pending results, 
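The ok result above is produced by a debug task that prints the stderr_lines captured from the earlier network_connections run; the result key "__network_connections_result.stderr_lines" indicates the var: form of debug. A hedged sketch of such a task, with the task name and variable taken from the trace:

- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines
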
returning what we have 11728 1726882191.36027: results queue empty 11728 1726882191.36027: checking for any_errors_fatal 11728 1726882191.36032: done checking for any_errors_fatal 11728 1726882191.36033: checking for max_fail_percentage 11728 1726882191.36034: done checking for max_fail_percentage 11728 1726882191.36035: checking to see if all hosts have failed and the running result is not ok 11728 1726882191.36035: done checking to see if all hosts have failed 11728 1726882191.36036: getting the remaining hosts for this loop 11728 1726882191.36037: done getting the remaining hosts for this loop 11728 1726882191.36040: getting the next task for host managed_node3 11728 1726882191.36045: done getting next task for host managed_node3 11728 1726882191.36048: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11728 1726882191.36051: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882191.36059: getting variables 11728 1726882191.36061: in VariableManager get_vars() 11728 1726882191.36089: Calling all_inventory to load vars for managed_node3 11728 1726882191.36092: Calling groups_inventory to load vars for managed_node3 11728 1726882191.36301: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882191.36309: Calling all_plugins_play to load vars for managed_node3 11728 1726882191.36316: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882191.36319: Calling groups_plugins_play to load vars for managed_node3 11728 1726882191.37692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882191.39697: done with get_vars() 11728 1726882191.39721: done getting variables 11728 1726882191.39779: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:29:51 -0400 (0:00:00.055) 0:00:16.250 ****** 11728 1726882191.39812: entering _queue_task() for managed_node3/debug 11728 1726882191.40072: worker is 1 (out of 1 available) 11728 1726882191.40084: exiting _queue_task() for managed_node3/debug 11728 1726882191.40209: done queuing things up, now waiting for results queue to drain 11728 1726882191.40211: waiting for pending results... 11728 1726882191.40379: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11728 1726882191.40506: in run() - task 12673a56-9f93-5c28-a762-00000000028a 11728 1726882191.40530: variable 'ansible_search_path' from source: unknown 11728 1726882191.40545: variable 'ansible_search_path' from source: unknown 11728 1726882191.40588: calling self._execute() 11728 1726882191.40697: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882191.40710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882191.40724: variable 'omit' from source: magic vars 11728 1726882191.41125: variable 'ansible_distribution_major_version' from source: facts 11728 1726882191.41143: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882191.41154: variable 'omit' from source: magic vars 11728 1726882191.41229: variable 'omit' from source: magic vars 11728 1726882191.41265: variable 'omit' from source: magic vars 11728 1726882191.41411: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882191.41415: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882191.41417: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882191.41420: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882191.41422: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882191.41532: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882191.41542: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882191.41559: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882191.42009: Set connection var ansible_connection to ssh 11728 1726882191.42012: Set connection var ansible_shell_executable to /bin/sh 11728 1726882191.42015: Set connection var ansible_timeout to 10 11728 1726882191.42017: Set connection var ansible_shell_type to sh 11728 1726882191.42019: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882191.42021: Set connection var ansible_pipelining to False 11728 1726882191.42023: variable 'ansible_shell_executable' from source: unknown 11728 1726882191.42025: variable 'ansible_connection' from source: unknown 11728 1726882191.42027: variable 'ansible_module_compression' from source: unknown 11728 1726882191.42029: variable 'ansible_shell_type' from source: unknown 11728 1726882191.42417: variable 'ansible_shell_executable' from source: unknown 11728 1726882191.42421: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882191.42423: variable 'ansible_pipelining' from source: unknown 11728 1726882191.42426: variable 'ansible_timeout' from source: unknown 11728 1726882191.42428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882191.42743: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882191.42961: variable 'omit' from source: magic vars 11728 1726882191.42963: starting attempt loop 11728 1726882191.42965: running the handler 11728 1726882191.42967: variable '__network_connections_result' from source: set_fact 11728 1726882191.43140: variable '__network_connections_result' from source: set_fact 11728 1726882191.43400: handler run complete 11728 1726882191.43641: attempt loop complete, returning result 11728 1726882191.43649: _execute() done 11728 1726882191.43656: dumping result to json 11728 1726882191.43733: done dumping result, returning 11728 1726882191.43747: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-5c28-a762-00000000028a] 11728 1726882191.43756: sending task result for task 12673a56-9f93-5c28-a762-00000000028a ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "mode": "802.3ad", "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], 
"force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a9f108ff-93a0-4692-a961-7fb7246e6129\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 7846a1e6-a3d1-419f-996a-824f28a6a5c0\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 595a8ae5-0b4e-4403-aa9d-4e632858ba4c\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a9f108ff-93a0-4692-a961-7fb7246e6129 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 7846a1e6-a3d1-419f-996a-824f28a6a5c0 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 595a8ae5-0b4e-4403-aa9d-4e632858ba4c (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a9f108ff-93a0-4692-a961-7fb7246e6129", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 7846a1e6-a3d1-419f-996a-824f28a6a5c0", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 595a8ae5-0b4e-4403-aa9d-4e632858ba4c", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a9f108ff-93a0-4692-a961-7fb7246e6129 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 7846a1e6-a3d1-419f-996a-824f28a6a5c0 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 595a8ae5-0b4e-4403-aa9d-4e632858ba4c (not-active)" ] } } 11728 1726882191.43999: no more pending results, returning what we have 11728 1726882191.44003: results queue empty 11728 1726882191.44004: checking for any_errors_fatal 11728 1726882191.44010: done checking for any_errors_fatal 11728 1726882191.44010: checking for max_fail_percentage 11728 1726882191.44012: done checking for max_fail_percentage 11728 1726882191.44013: checking to see if all hosts have failed and the running result is not ok 11728 1726882191.44014: done checking to see if all hosts have failed 11728 1726882191.44014: getting the remaining hosts for this loop 11728 1726882191.44016: done getting the remaining hosts for this loop 11728 1726882191.44021: getting the next task for host managed_node3 11728 1726882191.44028: done getting next task for host managed_node3 11728 1726882191.44032: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11728 1726882191.44036: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882191.44047: getting variables 11728 1726882191.44048: in VariableManager get_vars() 11728 1726882191.44081: Calling all_inventory to load vars for managed_node3 11728 1726882191.44084: Calling groups_inventory to load vars for managed_node3 11728 1726882191.44087: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882191.44301: Calling all_plugins_play to load vars for managed_node3 11728 1726882191.44306: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882191.44312: done sending task result for task 12673a56-9f93-5c28-a762-00000000028a 11728 1726882191.44314: WORKER PROCESS EXITING 11728 1726882191.44318: Calling groups_plugins_play to load vars for managed_node3 11728 1726882191.46944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882191.49878: done with get_vars() 11728 1726882191.49999: done getting variables 11728 1726882191.50062: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:29:51 -0400 (0:00:00.102) 0:00:16.353 ****** 11728 1726882191.50107: entering _queue_task() for managed_node3/debug 11728 1726882191.50420: worker is 1 (out of 1 available) 11728 1726882191.50435: exiting _queue_task() for managed_node3/debug 11728 1726882191.50449: done queuing things up, now waiting for results queue to drain 11728 1726882191.50451: waiting for pending results... 
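For orientation, the task being queued at this point ("Show debug messages for the network_state") is, judging from the conditional evaluation that follows in the log, a debug task guarded by the condition network_state != {}. The sketch below only illustrates that shape and is not taken from the role's task file; the variable name __network_state_result is a hypothetical stand-in.

    - name: Show debug messages for the network_state
      ansible.builtin.debug:
        var: __network_state_result   # hypothetical variable name, for illustration only
      when: network_state != {}       # matches the false_condition reported in the skip below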
11728 1726882191.50874: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11728 1726882191.50986: in run() - task 12673a56-9f93-5c28-a762-00000000028b 11728 1726882191.51219: variable 'ansible_search_path' from source: unknown 11728 1726882191.51224: variable 'ansible_search_path' from source: unknown 11728 1726882191.51259: calling self._execute() 11728 1726882191.51353: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882191.51357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882191.51366: variable 'omit' from source: magic vars 11728 1726882191.52234: variable 'ansible_distribution_major_version' from source: facts 11728 1726882191.52252: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882191.52400: variable 'network_state' from source: role '' defaults 11728 1726882191.52416: Evaluated conditional (network_state != {}): False 11728 1726882191.52424: when evaluation is False, skipping this task 11728 1726882191.52432: _execute() done 11728 1726882191.52438: dumping result to json 11728 1726882191.52496: done dumping result, returning 11728 1726882191.52501: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-5c28-a762-00000000028b] 11728 1726882191.52503: sending task result for task 12673a56-9f93-5c28-a762-00000000028b skipping: [managed_node3] => { "false_condition": "network_state != {}" } 11728 1726882191.52648: no more pending results, returning what we have 11728 1726882191.52653: results queue empty 11728 1726882191.52654: checking for any_errors_fatal 11728 1726882191.52669: done checking for any_errors_fatal 11728 1726882191.52669: checking for max_fail_percentage 11728 1726882191.52672: done checking for max_fail_percentage 11728 1726882191.52672: checking to see if all hosts have failed and the running result is not ok 11728 1726882191.52673: done checking to see if all hosts have failed 11728 1726882191.52674: getting the remaining hosts for this loop 11728 1726882191.52676: done getting the remaining hosts for this loop 11728 1726882191.52680: getting the next task for host managed_node3 11728 1726882191.52687: done getting next task for host managed_node3 11728 1726882191.52691: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 11728 1726882191.52699: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882191.52718: getting variables 11728 1726882191.52719: in VariableManager get_vars() 11728 1726882191.52759: Calling all_inventory to load vars for managed_node3 11728 1726882191.52762: Calling groups_inventory to load vars for managed_node3 11728 1726882191.52765: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882191.52777: Calling all_plugins_play to load vars for managed_node3 11728 1726882191.52780: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882191.52783: Calling groups_plugins_play to load vars for managed_node3 11728 1726882191.53411: done sending task result for task 12673a56-9f93-5c28-a762-00000000028b 11728 1726882191.53414: WORKER PROCESS EXITING 11728 1726882191.54565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882191.57071: done with get_vars() 11728 1726882191.57106: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:29:51 -0400 (0:00:00.070) 0:00:16.424 ****** 11728 1726882191.57206: entering _queue_task() for managed_node3/ping 11728 1726882191.57209: Creating lock for ping 11728 1726882191.57656: worker is 1 (out of 1 available) 11728 1726882191.57668: exiting _queue_task() for managed_node3/ping 11728 1726882191.57678: done queuing things up, now waiting for results queue to drain 11728 1726882191.57680: waiting for pending results... 11728 1726882191.57860: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 11728 1726882191.58047: in run() - task 12673a56-9f93-5c28-a762-00000000028c 11728 1726882191.58069: variable 'ansible_search_path' from source: unknown 11728 1726882191.58077: variable 'ansible_search_path' from source: unknown 11728 1726882191.58125: calling self._execute() 11728 1726882191.58601: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882191.58605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882191.58608: variable 'omit' from source: magic vars 11728 1726882191.59146: variable 'ansible_distribution_major_version' from source: facts 11728 1726882191.59165: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882191.59177: variable 'omit' from source: magic vars 11728 1726882191.59242: variable 'omit' from source: magic vars 11728 1726882191.59279: variable 'omit' from source: magic vars 11728 1726882191.59399: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882191.59436: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882191.59512: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882191.59563: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882191.59586: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882191.59626: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882191.59635: 
variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882191.59643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882191.59760: Set connection var ansible_connection to ssh 11728 1726882191.59777: Set connection var ansible_shell_executable to /bin/sh 11728 1726882191.59787: Set connection var ansible_timeout to 10 11728 1726882191.59801: Set connection var ansible_shell_type to sh 11728 1726882191.59815: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882191.59825: Set connection var ansible_pipelining to False 11728 1726882191.59853: variable 'ansible_shell_executable' from source: unknown 11728 1726882191.59861: variable 'ansible_connection' from source: unknown 11728 1726882191.59868: variable 'ansible_module_compression' from source: unknown 11728 1726882191.59874: variable 'ansible_shell_type' from source: unknown 11728 1726882191.59881: variable 'ansible_shell_executable' from source: unknown 11728 1726882191.59887: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882191.59900: variable 'ansible_pipelining' from source: unknown 11728 1726882191.59908: variable 'ansible_timeout' from source: unknown 11728 1726882191.59916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882191.60128: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11728 1726882191.60225: variable 'omit' from source: magic vars 11728 1726882191.60228: starting attempt loop 11728 1726882191.60230: running the handler 11728 1726882191.60233: _low_level_execute_command(): starting 11728 1726882191.60235: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882191.60909: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882191.60924: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882191.60941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882191.61001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882191.61064: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882191.61082: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882191.61115: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882191.61232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882191.62891: stdout chunk 
(state=3): >>>/root <<< 11728 1726882191.63026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882191.63049: stdout chunk (state=3): >>><<< 11728 1726882191.63053: stderr chunk (state=3): >>><<< 11728 1726882191.63160: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882191.63164: _low_level_execute_command(): starting 11728 1726882191.63168: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882191.6307347-12526-253791188430046 `" && echo ansible-tmp-1726882191.6307347-12526-253791188430046="` echo /root/.ansible/tmp/ansible-tmp-1726882191.6307347-12526-253791188430046 `" ) && sleep 0' 11728 1726882191.63654: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882191.63668: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882191.63681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882191.63699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882191.63717: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882191.63738: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882191.63750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882191.63814: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882191.63857: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882191.63873: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882191.63892: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882191.63967: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882191.65982: stdout chunk (state=3): >>>ansible-tmp-1726882191.6307347-12526-253791188430046=/root/.ansible/tmp/ansible-tmp-1726882191.6307347-12526-253791188430046 <<< 11728 1726882191.66211: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882191.66409: stderr chunk (state=3): >>><<< 11728 1726882191.66413: stdout chunk (state=3): >>><<< 11728 1726882191.66435: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882191.6307347-12526-253791188430046=/root/.ansible/tmp/ansible-tmp-1726882191.6307347-12526-253791188430046 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882191.66489: variable 'ansible_module_compression' from source: unknown 11728 1726882191.66535: ANSIBALLZ: Using lock for ping 11728 1726882191.66540: ANSIBALLZ: Acquiring lock 11728 1726882191.66542: ANSIBALLZ: Lock acquired: 139840769290944 11728 1726882191.66544: ANSIBALLZ: Creating module 11728 1726882191.79147: ANSIBALLZ: Writing module into payload 11728 1726882191.79196: ANSIBALLZ: Writing module 11728 1726882191.79254: ANSIBALLZ: Renaming module 11728 1726882191.79262: ANSIBALLZ: Done creating module 11728 1726882191.79264: variable 'ansible_facts' from source: unknown 11728 1726882191.79313: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882191.6307347-12526-253791188430046/AnsiballZ_ping.py 11728 1726882191.79552: Sending initial data 11728 1726882191.79555: Sent initial data (153 bytes) 11728 1726882191.80100: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882191.80103: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882191.80106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882191.80108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882191.80195: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882191.80202: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882191.80206: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882191.80263: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882191.80311: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882191.81931: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882191.81976: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882191.82032: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpt8_ixmkl /root/.ansible/tmp/ansible-tmp-1726882191.6307347-12526-253791188430046/AnsiballZ_ping.py <<< 11728 1726882191.82035: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882191.6307347-12526-253791188430046/AnsiballZ_ping.py" <<< 11728 1726882191.82074: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpt8_ixmkl" to remote "/root/.ansible/tmp/ansible-tmp-1726882191.6307347-12526-253791188430046/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882191.6307347-12526-253791188430046/AnsiballZ_ping.py" <<< 11728 1726882191.82803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882191.82981: stderr chunk (state=3): >>><<< 11728 1726882191.82985: stdout chunk (state=3): >>><<< 11728 1726882191.82987: done transferring module to remote 11728 1726882191.82989: _low_level_execute_command(): starting 11728 1726882191.82991: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882191.6307347-12526-253791188430046/ /root/.ansible/tmp/ansible-tmp-1726882191.6307347-12526-253791188430046/AnsiballZ_ping.py && sleep 0' 11728 1726882191.83657: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882191.83660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882191.83757: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882191.83761: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882191.83844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882191.85615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882191.85633: stderr chunk (state=3): >>><<< 11728 1726882191.85645: stdout chunk (state=3): >>><<< 11728 1726882191.85741: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882191.85745: _low_level_execute_command(): starting 11728 1726882191.85747: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882191.6307347-12526-253791188430046/AnsiballZ_ping.py && sleep 0' 11728 1726882191.86342: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882191.86346: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882191.86348: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882191.86369: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882191.86450: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882192.01301: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 11728 1726882192.02650: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 11728 1726882192.02810: stdout chunk (state=3): >>><<< 11728 1726882192.02814: stderr chunk (state=3): >>><<< 11728 1726882192.02946: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 11728 1726882192.02950: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882191.6307347-12526-253791188430046/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882192.02953: _low_level_execute_command(): starting 11728 1726882192.02955: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882191.6307347-12526-253791188430046/ > /dev/null 2>&1 && sleep 0' 11728 1726882192.03517: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882192.03526: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882192.03536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882192.03550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882192.03563: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882192.03570: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882192.03580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882192.03626: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882192.03630: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882192.03633: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 
11728 1726882192.03636: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882192.03642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882192.03645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882192.03647: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882192.03649: stderr chunk (state=3): >>>debug2: match found <<< 11728 1726882192.03732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882192.03736: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882192.03738: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882192.03764: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882192.03832: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882192.05861: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882192.05865: stdout chunk (state=3): >>><<< 11728 1726882192.05867: stderr chunk (state=3): >>><<< 11728 1726882192.05869: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882192.05872: handler run complete 11728 1726882192.05874: attempt loop complete, returning result 11728 1726882192.05876: _execute() done 11728 1726882192.05878: dumping result to json 11728 1726882192.05880: done dumping result, returning 11728 1726882192.05882: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-5c28-a762-00000000028c] 11728 1726882192.05883: sending task result for task 12673a56-9f93-5c28-a762-00000000028c ok: [managed_node3] => { "changed": false, "ping": "pong" } 11728 1726882192.06167: no more pending results, returning what we have 11728 1726882192.06171: results queue empty 11728 1726882192.06172: checking for any_errors_fatal 11728 1726882192.06177: done checking for any_errors_fatal 11728 1726882192.06178: checking for max_fail_percentage 11728 1726882192.06180: done checking for max_fail_percentage 11728 1726882192.06181: checking to see if 
all hosts have failed and the running result is not ok 11728 1726882192.06182: done checking to see if all hosts have failed 11728 1726882192.06182: getting the remaining hosts for this loop 11728 1726882192.06184: done getting the remaining hosts for this loop 11728 1726882192.06188: getting the next task for host managed_node3 11728 1726882192.06209: done getting next task for host managed_node3 11728 1726882192.06212: ^ task is: TASK: meta (role_complete) 11728 1726882192.06216: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882192.06226: getting variables 11728 1726882192.06228: in VariableManager get_vars() 11728 1726882192.06264: Calling all_inventory to load vars for managed_node3 11728 1726882192.06266: Calling groups_inventory to load vars for managed_node3 11728 1726882192.06269: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882192.06278: Calling all_plugins_play to load vars for managed_node3 11728 1726882192.06281: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882192.06283: Calling groups_plugins_play to load vars for managed_node3 11728 1726882192.06811: done sending task result for task 12673a56-9f93-5c28-a762-00000000028c 11728 1726882192.06815: WORKER PROCESS EXITING 11728 1726882192.07945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882192.09688: done with get_vars() 11728 1726882192.09714: done getting variables 11728 1726882192.09816: done queuing things up, now waiting for results queue to drain 11728 1726882192.09818: results queue empty 11728 1726882192.09819: checking for any_errors_fatal 11728 1726882192.09821: done checking for any_errors_fatal 11728 1726882192.09822: checking for max_fail_percentage 11728 1726882192.09823: done checking for max_fail_percentage 11728 1726882192.09823: checking to see if all hosts have failed and the running result is not ok 11728 1726882192.09824: done checking to see if all hosts have failed 11728 1726882192.09825: getting the remaining hosts for this loop 11728 1726882192.09825: done getting the remaining hosts for this loop 11728 1726882192.09828: getting the next task for host managed_node3 11728 1726882192.09832: done getting next task for host managed_node3 11728 1726882192.09833: ^ task is: TASK: Show result 11728 1726882192.09835: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882192.09838: getting variables 11728 1726882192.09838: in VariableManager get_vars() 11728 1726882192.09847: Calling all_inventory to load vars for managed_node3 11728 1726882192.09850: Calling groups_inventory to load vars for managed_node3 11728 1726882192.09852: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882192.09860: Calling all_plugins_play to load vars for managed_node3 11728 1726882192.09863: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882192.09866: Calling groups_plugins_play to load vars for managed_node3 11728 1726882192.11718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882192.13645: done with get_vars() 11728 1726882192.13668: done getting variables 11728 1726882192.13827: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile.yml:46 Friday 20 September 2024 21:29:52 -0400 (0:00:00.566) 0:00:16.990 ****** 11728 1726882192.13858: entering _queue_task() for managed_node3/debug 11728 1726882192.14592: worker is 1 (out of 1 available) 11728 1726882192.14609: exiting _queue_task() for managed_node3/debug 11728 1726882192.14621: done queuing things up, now waiting for results queue to drain 11728 1726882192.14622: waiting for pending results... 
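For context, the "Show result" task whose execution begins here (task path .../tests/network/playbooks/tasks/create_bond_profile.yml:46) prints the __network_connections_result fact that the role registered earlier via set_fact. A plausible minimal form of such a task is sketched below; it is an assumption based on the logged output, not a quote of the actual test file.

    - name: Show result
      ansible.builtin.debug:
        var: __network_connections_result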
11728 1726882192.14871: running TaskExecutor() for managed_node3/TASK: Show result 11728 1726882192.15004: in run() - task 12673a56-9f93-5c28-a762-0000000001c6 11728 1726882192.15025: variable 'ansible_search_path' from source: unknown 11728 1726882192.15036: variable 'ansible_search_path' from source: unknown 11728 1726882192.15082: calling self._execute() 11728 1726882192.15182: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882192.15260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882192.15264: variable 'omit' from source: magic vars 11728 1726882192.15575: variable 'ansible_distribution_major_version' from source: facts 11728 1726882192.15602: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882192.15614: variable 'omit' from source: magic vars 11728 1726882192.15662: variable 'omit' from source: magic vars 11728 1726882192.15709: variable 'omit' from source: magic vars 11728 1726882192.15751: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882192.15791: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882192.15822: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882192.15843: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882192.15860: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882192.15909: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882192.15912: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882192.16018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882192.16023: Set connection var ansible_connection to ssh 11728 1726882192.16035: Set connection var ansible_shell_executable to /bin/sh 11728 1726882192.16045: Set connection var ansible_timeout to 10 11728 1726882192.16051: Set connection var ansible_shell_type to sh 11728 1726882192.16061: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882192.16069: Set connection var ansible_pipelining to False 11728 1726882192.16102: variable 'ansible_shell_executable' from source: unknown 11728 1726882192.16111: variable 'ansible_connection' from source: unknown 11728 1726882192.16123: variable 'ansible_module_compression' from source: unknown 11728 1726882192.16133: variable 'ansible_shell_type' from source: unknown 11728 1726882192.16140: variable 'ansible_shell_executable' from source: unknown 11728 1726882192.16145: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882192.16152: variable 'ansible_pipelining' from source: unknown 11728 1726882192.16157: variable 'ansible_timeout' from source: unknown 11728 1726882192.16164: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882192.16310: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882192.16327: variable 'omit' from source: magic vars 11728 1726882192.16336: 
starting attempt loop 11728 1726882192.16350: running the handler 11728 1726882192.16403: variable '__network_connections_result' from source: set_fact 11728 1726882192.16500: variable '__network_connections_result' from source: set_fact 11728 1726882192.16736: handler run complete 11728 1726882192.16786: attempt loop complete, returning result 11728 1726882192.16892: _execute() done 11728 1726882192.16897: dumping result to json 11728 1726882192.16899: done dumping result, returning 11728 1726882192.16902: done running TaskExecutor() for managed_node3/TASK: Show result [12673a56-9f93-5c28-a762-0000000001c6] 11728 1726882192.16905: sending task result for task 12673a56-9f93-5c28-a762-0000000001c6 11728 1726882192.16988: done sending task result for task 12673a56-9f93-5c28-a762-0000000001c6 11728 1726882192.16998: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "ad_actor_sys_prio": 65535, "ad_actor_system": "00:00:5e:00:53:5d", "ad_select": "stable", "ad_user_port_key": 1023, "all_ports_active": true, "downdelay": 0, "lacp_rate": "slow", "lp_interval": 128, "miimon": 110, "min_links": 0, "mode": "802.3ad", "num_grat_arp": 64, "primary_reselect": "better", "resend_igmp": 225, "updelay": 0, "use_carrier": true, "xmit_hash_policy": "encap2+3" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a9f108ff-93a0-4692-a961-7fb7246e6129\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 7846a1e6-a3d1-419f-996a-824f28a6a5c0\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 595a8ae5-0b4e-4403-aa9d-4e632858ba4c\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a9f108ff-93a0-4692-a961-7fb7246e6129 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 7846a1e6-a3d1-419f-996a-824f28a6a5c0 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 595a8ae5-0b4e-4403-aa9d-4e632858ba4c (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a9f108ff-93a0-4692-a961-7fb7246e6129", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 7846a1e6-a3d1-419f-996a-824f28a6a5c0", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 595a8ae5-0b4e-4403-aa9d-4e632858ba4c", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a9f108ff-93a0-4692-a961-7fb7246e6129 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 7846a1e6-a3d1-419f-996a-824f28a6a5c0 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 595a8ae5-0b4e-4403-aa9d-4e632858ba4c (not-active)" ] } } 11728 1726882192.17117: no more pending results, returning what we have 11728 1726882192.17121: results queue empty 11728 1726882192.17122: 
checking for any_errors_fatal 11728 1726882192.17124: done checking for any_errors_fatal 11728 1726882192.17125: checking for max_fail_percentage 11728 1726882192.17127: done checking for max_fail_percentage 11728 1726882192.17127: checking to see if all hosts have failed and the running result is not ok 11728 1726882192.17128: done checking to see if all hosts have failed 11728 1726882192.17128: getting the remaining hosts for this loop 11728 1726882192.17130: done getting the remaining hosts for this loop 11728 1726882192.17133: getting the next task for host managed_node3 11728 1726882192.17141: done getting next task for host managed_node3 11728 1726882192.17144: ^ task is: TASK: Asserts 11728 1726882192.17147: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882192.17151: getting variables 11728 1726882192.17153: in VariableManager get_vars() 11728 1726882192.17181: Calling all_inventory to load vars for managed_node3 11728 1726882192.17184: Calling groups_inventory to load vars for managed_node3 11728 1726882192.17187: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882192.17200: Calling all_plugins_play to load vars for managed_node3 11728 1726882192.17203: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882192.17205: Calling groups_plugins_play to load vars for managed_node3 11728 1726882192.18670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882192.20222: done with get_vars() 11728 1726882192.20254: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 21:29:52 -0400 (0:00:00.064) 0:00:17.055 ****** 11728 1726882192.20355: entering _queue_task() for managed_node3/include_tasks 11728 1726882192.20912: worker is 1 (out of 1 available) 11728 1726882192.20922: exiting _queue_task() for managed_node3/include_tasks 11728 1726882192.20933: done queuing things up, now waiting for results queue to drain 11728 1726882192.20935: waiting for pending results... 
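Editor's note: the __network_connections_result printed above shows that the fedora.linux_system_roles.network role (provider "nm") was driven by one bond controller profile and two ethernet port profiles. Below is a minimal sketch of the network_connections input that would yield those module_args; it is reconstructed from the result itself, not copied from the test playbook, and the surrounding play/role invocation is omitted.

# Sketch only: network_connections inferred from the module_args in the result above.
network_connections:
  - name: bond0
    state: up
    type: bond
    interface_name: nm-bond
    bond:
      mode: 802.3ad
      ad_actor_sys_prio: 65535
      ad_actor_system: "00:00:5e:00:53:5d"
      ad_select: stable
      ad_user_port_key: 1023
      all_ports_active: true
      downdelay: 0
      lacp_rate: slow
      lp_interval: 128
      miimon: 110
      min_links: 0
      num_grat_arp: 64
      primary_reselect: better
      resend_igmp: 225
      updelay: 0
      use_carrier: true
      xmit_hash_policy: encap2+3
    ip:
      route_metric4: 65535
  - name: bond0.0
    state: up
    type: ethernet
    interface_name: test1
    controller: bond0
  - name: bond0.1
    state: up
    type: ethernet
    interface_name: test2
    controller: bond0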
11728 1726882192.21019: running TaskExecutor() for managed_node3/TASK: Asserts 11728 1726882192.21162: in run() - task 12673a56-9f93-5c28-a762-00000000008d 11728 1726882192.21169: variable 'ansible_search_path' from source: unknown 11728 1726882192.21271: variable 'ansible_search_path' from source: unknown 11728 1726882192.21275: variable 'lsr_assert' from source: include params 11728 1726882192.21437: variable 'lsr_assert' from source: include params 11728 1726882192.21520: variable 'omit' from source: magic vars 11728 1726882192.21653: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882192.21667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882192.21680: variable 'omit' from source: magic vars 11728 1726882192.21914: variable 'ansible_distribution_major_version' from source: facts 11728 1726882192.21939: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882192.21949: variable 'item' from source: unknown 11728 1726882192.22011: variable 'item' from source: unknown 11728 1726882192.22052: variable 'item' from source: unknown 11728 1726882192.22112: variable 'item' from source: unknown 11728 1726882192.22611: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882192.22614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882192.22617: variable 'omit' from source: magic vars 11728 1726882192.22619: variable 'ansible_distribution_major_version' from source: facts 11728 1726882192.22621: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882192.22622: variable 'item' from source: unknown 11728 1726882192.22624: variable 'item' from source: unknown 11728 1726882192.22625: variable 'item' from source: unknown 11728 1726882192.22659: variable 'item' from source: unknown 11728 1726882192.22785: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882192.22803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882192.22816: variable 'omit' from source: magic vars 11728 1726882192.22974: variable 'ansible_distribution_major_version' from source: facts 11728 1726882192.22984: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882192.22995: variable 'item' from source: unknown 11728 1726882192.23153: variable 'item' from source: unknown 11728 1726882192.23157: variable 'item' from source: unknown 11728 1726882192.23159: variable 'item' from source: unknown 11728 1726882192.23231: dumping result to json 11728 1726882192.23240: done dumping result, returning 11728 1726882192.23249: done running TaskExecutor() for managed_node3/TASK: Asserts [12673a56-9f93-5c28-a762-00000000008d] 11728 1726882192.23264: sending task result for task 12673a56-9f93-5c28-a762-00000000008d 11728 1726882192.23371: done sending task result for task 12673a56-9f93-5c28-a762-00000000008d 11728 1726882192.23375: WORKER PROCESS EXITING 11728 1726882192.23414: no more pending results, returning what we have 11728 1726882192.23420: in VariableManager get_vars() 11728 1726882192.23456: Calling all_inventory to load vars for managed_node3 11728 1726882192.23459: Calling groups_inventory to load vars for managed_node3 11728 1726882192.23463: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882192.23482: Calling all_plugins_play to load vars for managed_node3 11728 1726882192.23486: Calling groups_plugins_inventory to load vars for 
managed_node3 11728 1726882192.23490: Calling groups_plugins_play to load vars for managed_node3 11728 1726882192.25042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882192.26621: done with get_vars() 11728 1726882192.26640: variable 'ansible_search_path' from source: unknown 11728 1726882192.26647: variable 'ansible_search_path' from source: unknown 11728 1726882192.26689: variable 'ansible_search_path' from source: unknown 11728 1726882192.26690: variable 'ansible_search_path' from source: unknown 11728 1726882192.26724: variable 'ansible_search_path' from source: unknown 11728 1726882192.26725: variable 'ansible_search_path' from source: unknown 11728 1726882192.26760: we have included files to process 11728 1726882192.26761: generating all_blocks data 11728 1726882192.26763: done generating all_blocks data 11728 1726882192.26768: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_controller_device_present.yml 11728 1726882192.26769: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_controller_device_present.yml 11728 1726882192.26771: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_controller_device_present.yml 11728 1726882192.26937: in VariableManager get_vars() 11728 1726882192.26956: done with get_vars() 11728 1726882192.26963: variable 'item' from source: include params 11728 1726882192.27071: variable 'item' from source: include params 11728 1726882192.27109: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 11728 1726882192.27187: in VariableManager get_vars() 11728 1726882192.27211: done with get_vars() 11728 1726882192.27334: done processing included file 11728 1726882192.27336: iterating over new_blocks loaded from include file 11728 1726882192.27338: in VariableManager get_vars() 11728 1726882192.27351: done with get_vars() 11728 1726882192.27352: filtering new block on tags 11728 1726882192.27396: done filtering new block on tags 11728 1726882192.27399: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_controller_device_present.yml for managed_node3 => (item=tasks/assert_controller_device_present.yml) 11728 1726882192.27403: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml 11728 1726882192.27404: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml 11728 1726882192.27407: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml 11728 1726882192.27539: in VariableManager get_vars() 11728 1726882192.27556: done with get_vars() 11728 1726882192.27567: done processing included file 11728 1726882192.27569: iterating over new_blocks loaded from include file 11728 1726882192.27570: in VariableManager get_vars() 11728 
1726882192.27583: done with get_vars() 11728 1726882192.27584: filtering new block on tags 11728 1726882192.27607: done filtering new block on tags 11728 1726882192.27609: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml for managed_node3 => (item=tasks/assert_bond_port_profile_present.yml) 11728 1726882192.27613: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 11728 1726882192.27614: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 11728 1726882192.27625: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 11728 1726882192.27820: in VariableManager get_vars() 11728 1726882192.27832: done with get_vars() 11728 1726882192.27862: in VariableManager get_vars() 11728 1726882192.27879: done with get_vars() 11728 1726882192.27887: done processing included file 11728 1726882192.27888: iterating over new_blocks loaded from include file 11728 1726882192.27889: in VariableManager get_vars() 11728 1726882192.27898: done with get_vars() 11728 1726882192.27900: filtering new block on tags 11728 1726882192.27922: done filtering new block on tags 11728 1726882192.27923: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml for managed_node3 => (item=tasks/assert_bond_options.yml) 11728 1726882192.27926: extending task lists for all hosts with included blocks 11728 1726882192.28767: done extending task lists 11728 1726882192.28768: done processing included files 11728 1726882192.28768: results queue empty 11728 1726882192.28769: checking for any_errors_fatal 11728 1726882192.28773: done checking for any_errors_fatal 11728 1726882192.28773: checking for max_fail_percentage 11728 1726882192.28774: done checking for max_fail_percentage 11728 1726882192.28774: checking to see if all hosts have failed and the running result is not ok 11728 1726882192.28775: done checking to see if all hosts have failed 11728 1726882192.28775: getting the remaining hosts for this loop 11728 1726882192.28776: done getting the remaining hosts for this loop 11728 1726882192.28778: getting the next task for host managed_node3 11728 1726882192.28781: done getting next task for host managed_node3 11728 1726882192.28782: ^ task is: TASK: Include the task 'get_interface_stat.yml' 11728 1726882192.28784: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882192.28785: getting variables 11728 1726882192.28786: in VariableManager get_vars() 11728 1726882192.28792: Calling all_inventory to load vars for managed_node3 11728 1726882192.28795: Calling groups_inventory to load vars for managed_node3 11728 1726882192.28797: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882192.28801: Calling all_plugins_play to load vars for managed_node3 11728 1726882192.28802: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882192.28804: Calling groups_plugins_play to load vars for managed_node3 11728 1726882192.29403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882192.30383: done with get_vars() 11728 1726882192.30400: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:29:52 -0400 (0:00:00.100) 0:00:17.156 ****** 11728 1726882192.30452: entering _queue_task() for managed_node3/include_tasks 11728 1726882192.30688: worker is 1 (out of 1 available) 11728 1726882192.30701: exiting _queue_task() for managed_node3/include_tasks 11728 1726882192.30715: done queuing things up, now waiting for results queue to drain 11728 1726882192.30717: waiting for pending results... 
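Editor's note: get_interface_stat.yml itself is not echoed in this log, but the stat invocation that follows (path /sys/class/net/nm-bond with get_attributes, get_checksum and get_mime disabled, and a registered variable later read back as interface_stat) pins down its shape. A sketch follows; the actual file shipped with the collection may differ in detail.

# Sketch of tasks/get_interface_stat.yml, inferred from the stat module_args
# and the registered variable name observed in this run.
- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat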
11728 1726882192.30881: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 11728 1726882192.30987: in run() - task 12673a56-9f93-5c28-a762-0000000003f5 11728 1726882192.31200: variable 'ansible_search_path' from source: unknown 11728 1726882192.31204: variable 'ansible_search_path' from source: unknown 11728 1726882192.31208: calling self._execute() 11728 1726882192.31211: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882192.31214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882192.31217: variable 'omit' from source: magic vars 11728 1726882192.31533: variable 'ansible_distribution_major_version' from source: facts 11728 1726882192.31572: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882192.31582: _execute() done 11728 1726882192.31588: dumping result to json 11728 1726882192.31601: done dumping result, returning 11728 1726882192.31612: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-5c28-a762-0000000003f5] 11728 1726882192.31622: sending task result for task 12673a56-9f93-5c28-a762-0000000003f5 11728 1726882192.31757: no more pending results, returning what we have 11728 1726882192.31763: in VariableManager get_vars() 11728 1726882192.31803: Calling all_inventory to load vars for managed_node3 11728 1726882192.31806: Calling groups_inventory to load vars for managed_node3 11728 1726882192.31809: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882192.31821: Calling all_plugins_play to load vars for managed_node3 11728 1726882192.31823: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882192.31826: Calling groups_plugins_play to load vars for managed_node3 11728 1726882192.32356: done sending task result for task 12673a56-9f93-5c28-a762-0000000003f5 11728 1726882192.32360: WORKER PROCESS EXITING 11728 1726882192.33103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882192.37922: done with get_vars() 11728 1726882192.37941: variable 'ansible_search_path' from source: unknown 11728 1726882192.37942: variable 'ansible_search_path' from source: unknown 11728 1726882192.37976: we have included files to process 11728 1726882192.37977: generating all_blocks data 11728 1726882192.37978: done generating all_blocks data 11728 1726882192.37979: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11728 1726882192.37980: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11728 1726882192.37981: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11728 1726882192.38332: done processing included file 11728 1726882192.38334: iterating over new_blocks loaded from include file 11728 1726882192.38335: in VariableManager get_vars() 11728 1726882192.38351: done with get_vars() 11728 1726882192.38353: filtering new block on tags 11728 1726882192.38378: done filtering new block on tags 11728 1726882192.38380: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 11728 
1726882192.38384: extending task lists for all hosts with included blocks 11728 1726882192.38791: done extending task lists 11728 1726882192.38795: done processing included files 11728 1726882192.38795: results queue empty 11728 1726882192.38796: checking for any_errors_fatal 11728 1726882192.38798: done checking for any_errors_fatal 11728 1726882192.38799: checking for max_fail_percentage 11728 1726882192.38800: done checking for max_fail_percentage 11728 1726882192.38801: checking to see if all hosts have failed and the running result is not ok 11728 1726882192.38802: done checking to see if all hosts have failed 11728 1726882192.38802: getting the remaining hosts for this loop 11728 1726882192.38803: done getting the remaining hosts for this loop 11728 1726882192.38806: getting the next task for host managed_node3 11728 1726882192.38810: done getting next task for host managed_node3 11728 1726882192.38812: ^ task is: TASK: Get stat for interface {{ interface }} 11728 1726882192.38815: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882192.38818: getting variables 11728 1726882192.38818: in VariableManager get_vars() 11728 1726882192.38828: Calling all_inventory to load vars for managed_node3 11728 1726882192.38830: Calling groups_inventory to load vars for managed_node3 11728 1726882192.38832: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882192.38837: Calling all_plugins_play to load vars for managed_node3 11728 1726882192.38840: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882192.38842: Calling groups_plugins_play to load vars for managed_node3 11728 1726882192.41144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882192.42992: done with get_vars() 11728 1726882192.43016: done getting variables 11728 1726882192.43155: variable 'interface' from source: task vars 11728 1726882192.43158: variable 'controller_device' from source: play vars 11728 1726882192.43217: variable 'controller_device' from source: play vars TASK [Get stat for interface nm-bond] ****************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:29:52 -0400 (0:00:00.127) 0:00:17.284 ****** 11728 1726882192.43248: entering _queue_task() for managed_node3/stat 11728 1726882192.43571: worker is 1 (out of 1 available) 11728 1726882192.43583: exiting _queue_task() for managed_node3/stat 11728 1726882192.43597: done queuing things up, now waiting for results queue to drain 11728 1726882192.43599: waiting for pending results... 11728 1726882192.43914: running TaskExecutor() for managed_node3/TASK: Get stat for interface nm-bond 11728 1726882192.44002: in run() - task 12673a56-9f93-5c28-a762-0000000004af 11728 1726882192.44065: variable 'ansible_search_path' from source: unknown 11728 1726882192.44068: variable 'ansible_search_path' from source: unknown 11728 1726882192.44072: calling self._execute() 11728 1726882192.44173: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882192.44186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882192.44207: variable 'omit' from source: magic vars 11728 1726882192.44649: variable 'ansible_distribution_major_version' from source: facts 11728 1726882192.44666: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882192.44690: variable 'omit' from source: magic vars 11728 1726882192.44838: variable 'omit' from source: magic vars 11728 1726882192.44878: variable 'interface' from source: task vars 11728 1726882192.44888: variable 'controller_device' from source: play vars 11728 1726882192.44962: variable 'controller_device' from source: play vars 11728 1726882192.44987: variable 'omit' from source: magic vars 11728 1726882192.45036: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882192.45105: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882192.45130: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882192.45164: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882192.45182: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 11728 1726882192.45222: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882192.45232: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882192.45282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882192.45384: Set connection var ansible_connection to ssh 11728 1726882192.45405: Set connection var ansible_shell_executable to /bin/sh 11728 1726882192.45424: Set connection var ansible_timeout to 10 11728 1726882192.45440: Set connection var ansible_shell_type to sh 11728 1726882192.45507: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882192.45511: Set connection var ansible_pipelining to False 11728 1726882192.45514: variable 'ansible_shell_executable' from source: unknown 11728 1726882192.45516: variable 'ansible_connection' from source: unknown 11728 1726882192.45518: variable 'ansible_module_compression' from source: unknown 11728 1726882192.45526: variable 'ansible_shell_type' from source: unknown 11728 1726882192.45533: variable 'ansible_shell_executable' from source: unknown 11728 1726882192.45540: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882192.45548: variable 'ansible_pipelining' from source: unknown 11728 1726882192.45556: variable 'ansible_timeout' from source: unknown 11728 1726882192.45564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882192.45930: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11728 1726882192.45934: variable 'omit' from source: magic vars 11728 1726882192.45937: starting attempt loop 11728 1726882192.45939: running the handler 11728 1726882192.45941: _low_level_execute_command(): starting 11728 1726882192.45943: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882192.46687: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882192.46777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882192.46834: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882192.46848: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882192.46899: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882192.48574: stdout chunk (state=3): 
>>>/root <<< 11728 1726882192.48731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882192.48735: stdout chunk (state=3): >>><<< 11728 1726882192.48737: stderr chunk (state=3): >>><<< 11728 1726882192.48804: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882192.48809: _low_level_execute_command(): starting 11728 1726882192.48812: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882192.48759-12588-7659102941725 `" && echo ansible-tmp-1726882192.48759-12588-7659102941725="` echo /root/.ansible/tmp/ansible-tmp-1726882192.48759-12588-7659102941725 `" ) && sleep 0' 11728 1726882192.49355: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882192.49370: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882192.49401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882192.49424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882192.49442: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882192.49501: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882192.49521: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882192.49586: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882192.51501: stdout chunk (state=3): 
>>>ansible-tmp-1726882192.48759-12588-7659102941725=/root/.ansible/tmp/ansible-tmp-1726882192.48759-12588-7659102941725 <<< 11728 1726882192.51632: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882192.51649: stdout chunk (state=3): >>><<< 11728 1726882192.51664: stderr chunk (state=3): >>><<< 11728 1726882192.51681: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882192.48759-12588-7659102941725=/root/.ansible/tmp/ansible-tmp-1726882192.48759-12588-7659102941725 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882192.51730: variable 'ansible_module_compression' from source: unknown 11728 1726882192.51803: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11728 1726882192.51852: variable 'ansible_facts' from source: unknown 11728 1726882192.51947: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882192.48759-12588-7659102941725/AnsiballZ_stat.py 11728 1726882192.52169: Sending initial data 11728 1726882192.52186: Sent initial data (149 bytes) 11728 1726882192.52736: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882192.52744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882192.52749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882192.52756: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882192.52759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882192.52801: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882192.52811: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882192.52860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882192.54434: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882192.54475: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882192.54524: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpm5iegoxi /root/.ansible/tmp/ansible-tmp-1726882192.48759-12588-7659102941725/AnsiballZ_stat.py <<< 11728 1726882192.54527: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882192.48759-12588-7659102941725/AnsiballZ_stat.py" <<< 11728 1726882192.54573: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpm5iegoxi" to remote "/root/.ansible/tmp/ansible-tmp-1726882192.48759-12588-7659102941725/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882192.48759-12588-7659102941725/AnsiballZ_stat.py" <<< 11728 1726882192.55262: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882192.55301: stderr chunk (state=3): >>><<< 11728 1726882192.55399: stdout chunk (state=3): >>><<< 11728 1726882192.55404: done transferring module to remote 11728 1726882192.55406: _low_level_execute_command(): starting 11728 1726882192.55408: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882192.48759-12588-7659102941725/ /root/.ansible/tmp/ansible-tmp-1726882192.48759-12588-7659102941725/AnsiballZ_stat.py && sleep 0' 11728 1726882192.55863: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882192.55877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 11728 1726882192.55887: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882192.55936: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882192.55952: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882192.55998: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882192.57792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882192.57803: stdout chunk (state=3): >>><<< 11728 1726882192.57806: stderr chunk (state=3): >>><<< 11728 1726882192.57889: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882192.57896: _low_level_execute_command(): starting 11728 1726882192.57900: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882192.48759-12588-7659102941725/AnsiballZ_stat.py && sleep 0' 11728 1726882192.58308: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882192.58320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882192.58331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882192.58374: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882192.58397: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 
1726882192.58438: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882192.73610: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28236, "dev": 23, "nlink": 1, "atime": 1726882191.091223, "mtime": 1726882191.091223, "ctime": 1726882191.091223, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11728 1726882192.74889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882192.74934: stderr chunk (state=3): >>><<< 11728 1726882192.74937: stdout chunk (state=3): >>><<< 11728 1726882192.74958: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28236, "dev": 23, "nlink": 1, "atime": 1726882191.091223, "mtime": 1726882191.091223, "ctime": 1726882191.091223, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 Shared connection to 10.31.10.229 closed. 11728 1726882192.74998: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882192.48759-12588-7659102941725/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882192.75006: _low_level_execute_command(): starting 11728 1726882192.75011: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882192.48759-12588-7659102941725/ > /dev/null 2>&1 && sleep 0' 11728 1726882192.75462: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882192.75465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882192.75471: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882192.75473: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882192.75475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882192.75530: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882192.75537: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882192.75538: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882192.75581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882192.77410: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882192.77432: stderr chunk (state=3): >>><<< 11728 1726882192.77435: stdout chunk (state=3): >>><<< 11728 1726882192.77448: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882192.77454: handler run complete 11728 1726882192.77487: attempt loop complete, returning result 11728 1726882192.77490: _execute() done 11728 1726882192.77492: dumping result to json 11728 1726882192.77501: done dumping result, returning 11728 1726882192.77509: done running TaskExecutor() for managed_node3/TASK: Get stat for interface nm-bond [12673a56-9f93-5c28-a762-0000000004af] 11728 1726882192.77513: sending task result for task 12673a56-9f93-5c28-a762-0000000004af 11728 1726882192.77621: done sending task result for task 12673a56-9f93-5c28-a762-0000000004af 11728 1726882192.77624: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726882191.091223, "block_size": 4096, "blocks": 0, "ctime": 1726882191.091223, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 28236, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "mode": "0777", "mtime": 1726882191.091223, "nlink": 1, "path": "/sys/class/net/nm-bond", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11728 1726882192.77708: no more pending results, returning what we have 11728 1726882192.77713: results queue empty 11728 1726882192.77714: checking for any_errors_fatal 11728 1726882192.77715: done checking for any_errors_fatal 11728 1726882192.77716: checking for max_fail_percentage 11728 1726882192.77718: done checking for max_fail_percentage 11728 1726882192.77718: checking to see if all hosts have failed and the running result is not ok 11728 1726882192.77719: done checking to see if all hosts have failed 11728 1726882192.77720: getting the remaining hosts for this loop 11728 1726882192.77721: done getting the remaining hosts for this loop 11728 1726882192.77724: getting the next task for host managed_node3 11728 1726882192.77736: done getting next task for host managed_node3 11728 1726882192.77739: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11728 1726882192.77742: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882192.77747: getting variables 11728 1726882192.77748: in VariableManager get_vars() 11728 1726882192.77778: Calling all_inventory to load vars for managed_node3 11728 1726882192.77781: Calling groups_inventory to load vars for managed_node3 11728 1726882192.77784: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882192.77798: Calling all_plugins_play to load vars for managed_node3 11728 1726882192.77801: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882192.77804: Calling groups_plugins_play to load vars for managed_node3 11728 1726882192.78614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882192.80354: done with get_vars() 11728 1726882192.80374: done getting variables 11728 1726882192.80437: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11728 1726882192.80558: variable 'interface' from source: task vars 11728 1726882192.80562: variable 'controller_device' from source: play vars 11728 1726882192.80625: variable 'controller_device' from source: play vars TASK [Assert that the interface is present - 'nm-bond'] ************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:29:52 -0400 (0:00:00.374) 0:00:17.659 ****** 11728 1726882192.80660: entering _queue_task() for managed_node3/assert 11728 1726882192.80944: worker is 1 (out of 1 available) 11728 1726882192.80955: exiting _queue_task() for managed_node3/assert 11728 1726882192.80965: done queuing things up, now waiting for results queue to drain 11728 1726882192.80966: waiting for pending results... 
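Editor's note: the next task, "Assert that the interface is present - 'nm-bond'" (assert_device_present.yml:5), evaluates the conditional interface_stat.stat.exists. A sketch of what that task plausibly looks like; the task name and conditional are taken from this log, while the failure message is an assumed placeholder.

# Sketch of the assert task; msg wording is hypothetical.
- name: Assert that the interface is present - '{{ interface }}'
  assert:
    that:
      - interface_stat.stat.exists
    msg: "Interface {{ interface }} is not present"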
11728 1726882192.81310: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'nm-bond' 11728 1726882192.81401: in run() - task 12673a56-9f93-5c28-a762-0000000003f6 11728 1726882192.81406: variable 'ansible_search_path' from source: unknown 11728 1726882192.81409: variable 'ansible_search_path' from source: unknown 11728 1726882192.81412: calling self._execute() 11728 1726882192.81508: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882192.81512: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882192.81522: variable 'omit' from source: magic vars 11728 1726882192.82098: variable 'ansible_distribution_major_version' from source: facts 11728 1726882192.82102: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882192.82105: variable 'omit' from source: magic vars 11728 1726882192.82107: variable 'omit' from source: magic vars 11728 1726882192.82109: variable 'interface' from source: task vars 11728 1726882192.82111: variable 'controller_device' from source: play vars 11728 1726882192.82123: variable 'controller_device' from source: play vars 11728 1726882192.82146: variable 'omit' from source: magic vars 11728 1726882192.82186: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882192.82234: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882192.82259: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882192.82280: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882192.82304: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882192.82339: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882192.82348: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882192.82356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882192.82456: Set connection var ansible_connection to ssh 11728 1726882192.82470: Set connection var ansible_shell_executable to /bin/sh 11728 1726882192.82478: Set connection var ansible_timeout to 10 11728 1726882192.82483: Set connection var ansible_shell_type to sh 11728 1726882192.82491: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882192.82504: Set connection var ansible_pipelining to False 11728 1726882192.82532: variable 'ansible_shell_executable' from source: unknown 11728 1726882192.82539: variable 'ansible_connection' from source: unknown 11728 1726882192.82544: variable 'ansible_module_compression' from source: unknown 11728 1726882192.82549: variable 'ansible_shell_type' from source: unknown 11728 1726882192.82553: variable 'ansible_shell_executable' from source: unknown 11728 1726882192.82558: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882192.82563: variable 'ansible_pipelining' from source: unknown 11728 1726882192.82568: variable 'ansible_timeout' from source: unknown 11728 1726882192.82574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882192.82724: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882192.82746: variable 'omit' from source: magic vars 11728 1726882192.82757: starting attempt loop 11728 1726882192.82763: running the handler 11728 1726882192.82904: variable 'interface_stat' from source: set_fact 11728 1726882192.82930: Evaluated conditional (interface_stat.stat.exists): True 11728 1726882192.82940: handler run complete 11728 1726882192.82964: attempt loop complete, returning result 11728 1726882192.82971: _execute() done 11728 1726882192.82977: dumping result to json 11728 1726882192.82984: done dumping result, returning 11728 1726882192.83000: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'nm-bond' [12673a56-9f93-5c28-a762-0000000003f6] 11728 1726882192.83011: sending task result for task 12673a56-9f93-5c28-a762-0000000003f6 11728 1726882192.83201: done sending task result for task 12673a56-9f93-5c28-a762-0000000003f6 11728 1726882192.83204: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11728 1726882192.83256: no more pending results, returning what we have 11728 1726882192.83260: results queue empty 11728 1726882192.83262: checking for any_errors_fatal 11728 1726882192.83272: done checking for any_errors_fatal 11728 1726882192.83273: checking for max_fail_percentage 11728 1726882192.83275: done checking for max_fail_percentage 11728 1726882192.83276: checking to see if all hosts have failed and the running result is not ok 11728 1726882192.83277: done checking to see if all hosts have failed 11728 1726882192.83278: getting the remaining hosts for this loop 11728 1726882192.83280: done getting the remaining hosts for this loop 11728 1726882192.83284: getting the next task for host managed_node3 11728 1726882192.83300: done getting next task for host managed_node3 11728 1726882192.83303: ^ task is: TASK: Include the task 'assert_profile_present.yml' 11728 1726882192.83308: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882192.83313: getting variables 11728 1726882192.83315: in VariableManager get_vars() 11728 1726882192.83351: Calling all_inventory to load vars for managed_node3 11728 1726882192.83354: Calling groups_inventory to load vars for managed_node3 11728 1726882192.83358: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882192.83369: Calling all_plugins_play to load vars for managed_node3 11728 1726882192.83373: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882192.83376: Calling groups_plugins_play to load vars for managed_node3 11728 1726882192.84806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882192.86379: done with get_vars() 11728 1726882192.86401: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_port_profile_present.yml:3 Friday 20 September 2024 21:29:52 -0400 (0:00:00.058) 0:00:17.717 ****** 11728 1726882192.86485: entering _queue_task() for managed_node3/include_tasks 11728 1726882192.86741: worker is 1 (out of 1 available) 11728 1726882192.86753: exiting _queue_task() for managed_node3/include_tasks 11728 1726882192.86764: done queuing things up, now waiting for results queue to drain 11728 1726882192.86765: waiting for pending results... 11728 1726882192.87033: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' 11728 1726882192.87200: in run() - task 12673a56-9f93-5c28-a762-0000000003fb 11728 1726882192.87204: variable 'ansible_search_path' from source: unknown 11728 1726882192.87206: variable 'ansible_search_path' from source: unknown 11728 1726882192.87209: variable 'controller_profile' from source: play vars 11728 1726882192.87398: variable 'controller_profile' from source: play vars 11728 1726882192.87418: variable 'port1_profile' from source: play vars 11728 1726882192.87488: variable 'port1_profile' from source: play vars 11728 1726882192.87505: variable 'port2_profile' from source: play vars 11728 1726882192.87574: variable 'port2_profile' from source: play vars 11728 1726882192.87589: variable 'omit' from source: magic vars 11728 1726882192.87762: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882192.87766: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882192.87773: variable 'omit' from source: magic vars 11728 1726882192.88022: variable 'ansible_distribution_major_version' from source: facts 11728 1726882192.88200: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882192.88204: variable 'bond_port_profile' from source: unknown 11728 1726882192.88206: variable 'bond_port_profile' from source: unknown 11728 1726882192.88400: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882192.88404: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882192.88406: variable 'omit' from source: magic vars 11728 1726882192.88483: variable 'ansible_distribution_major_version' from source: facts 11728 1726882192.88496: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882192.88538: variable 'bond_port_profile' from source: unknown 11728 1726882192.88608: variable 'bond_port_profile' from source: unknown 11728 1726882192.88751: variable 'ansible_host' 
from source: host vars for 'managed_node3' 11728 1726882192.88755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882192.88860: variable 'omit' from source: magic vars 11728 1726882192.88926: variable 'ansible_distribution_major_version' from source: facts 11728 1726882192.88937: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882192.88974: variable 'bond_port_profile' from source: unknown 11728 1726882192.89042: variable 'bond_port_profile' from source: unknown 11728 1726882192.89131: dumping result to json 11728 1726882192.89141: done dumping result, returning 11728 1726882192.89201: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' [12673a56-9f93-5c28-a762-0000000003fb] 11728 1726882192.89204: sending task result for task 12673a56-9f93-5c28-a762-0000000003fb 11728 1726882192.89522: no more pending results, returning what we have 11728 1726882192.89528: in VariableManager get_vars() 11728 1726882192.89557: Calling all_inventory to load vars for managed_node3 11728 1726882192.89560: Calling groups_inventory to load vars for managed_node3 11728 1726882192.89563: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882192.89572: Calling all_plugins_play to load vars for managed_node3 11728 1726882192.89575: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882192.89578: Calling groups_plugins_play to load vars for managed_node3 11728 1726882192.90207: done sending task result for task 12673a56-9f93-5c28-a762-0000000003fb 11728 1726882192.90211: WORKER PROCESS EXITING 11728 1726882192.90980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882192.92558: done with get_vars() 11728 1726882192.92576: variable 'ansible_search_path' from source: unknown 11728 1726882192.92577: variable 'ansible_search_path' from source: unknown 11728 1726882192.92586: variable 'item' from source: include params 11728 1726882192.92686: variable 'item' from source: include params 11728 1726882192.92728: variable 'ansible_search_path' from source: unknown 11728 1726882192.92729: variable 'ansible_search_path' from source: unknown 11728 1726882192.92735: variable 'item' from source: include params 11728 1726882192.92798: variable 'item' from source: include params 11728 1726882192.92830: variable 'ansible_search_path' from source: unknown 11728 1726882192.92831: variable 'ansible_search_path' from source: unknown 11728 1726882192.92837: variable 'item' from source: include params 11728 1726882192.92891: variable 'item' from source: include params 11728 1726882192.92922: we have included files to process 11728 1726882192.92923: generating all_blocks data 11728 1726882192.92925: done generating all_blocks data 11728 1726882192.92929: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11728 1726882192.92930: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11728 1726882192.92932: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11728 1726882192.93122: in VariableManager get_vars() 11728 1726882192.93143: done with get_vars() 11728 1726882192.93397: done processing included file 11728 1726882192.93399: 
iterating over new_blocks loaded from include file 11728 1726882192.93400: in VariableManager get_vars() 11728 1726882192.93414: done with get_vars() 11728 1726882192.93415: filtering new block on tags 11728 1726882192.93460: done filtering new block on tags 11728 1726882192.93463: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => (item=bond0) 11728 1726882192.93467: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11728 1726882192.93468: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11728 1726882192.93470: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11728 1726882192.93555: in VariableManager get_vars() 11728 1726882192.93576: done with get_vars() 11728 1726882192.93817: done processing included file 11728 1726882192.93819: iterating over new_blocks loaded from include file 11728 1726882192.93821: in VariableManager get_vars() 11728 1726882192.93908: done with get_vars() 11728 1726882192.93911: filtering new block on tags 11728 1726882192.93965: done filtering new block on tags 11728 1726882192.93968: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => (item=bond0.0) 11728 1726882192.93972: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11728 1726882192.93973: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11728 1726882192.93975: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11728 1726882192.94080: in VariableManager get_vars() 11728 1726882192.94104: done with get_vars() 11728 1726882192.94338: done processing included file 11728 1726882192.94339: iterating over new_blocks loaded from include file 11728 1726882192.94341: in VariableManager get_vars() 11728 1726882192.94355: done with get_vars() 11728 1726882192.94357: filtering new block on tags 11728 1726882192.94414: done filtering new block on tags 11728 1726882192.94417: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => (item=bond0.1) 11728 1726882192.94421: extending task lists for all hosts with included blocks 11728 1726882192.94533: done extending task lists 11728 1726882192.94534: done processing included files 11728 1726882192.94535: results queue empty 11728 1726882192.94536: checking for any_errors_fatal 11728 1726882192.94539: done checking for any_errors_fatal 11728 1726882192.94539: checking for max_fail_percentage 11728 1726882192.94540: done checking for max_fail_percentage 11728 1726882192.94541: checking to see if all hosts have failed and the running result is not ok 11728 1726882192.94542: done checking to see if all hosts have failed 11728 1726882192.94542: 
getting the remaining hosts for this loop 11728 1726882192.94544: done getting the remaining hosts for this loop 11728 1726882192.94546: getting the next task for host managed_node3 11728 1726882192.94550: done getting next task for host managed_node3 11728 1726882192.94552: ^ task is: TASK: Include the task 'get_profile_stat.yml' 11728 1726882192.94555: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882192.94557: getting variables 11728 1726882192.94558: in VariableManager get_vars() 11728 1726882192.94567: Calling all_inventory to load vars for managed_node3 11728 1726882192.94569: Calling groups_inventory to load vars for managed_node3 11728 1726882192.94571: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882192.94577: Calling all_plugins_play to load vars for managed_node3 11728 1726882192.94579: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882192.94583: Calling groups_plugins_play to load vars for managed_node3 11728 1726882192.95763: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882192.97438: done with get_vars() 11728 1726882192.97468: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:29:52 -0400 (0:00:00.110) 0:00:17.827 ****** 11728 1726882192.97557: entering _queue_task() for managed_node3/include_tasks 11728 1726882192.98135: worker is 1 (out of 1 available) 11728 1726882192.98148: exiting _queue_task() for managed_node3/include_tasks 11728 1726882192.98160: done queuing things up, now waiting for results queue to drain 11728 1726882192.98161: waiting for pending results... 
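For reference, the include task traced above (assert_bond_port_profile_present.yml:3), which queued assert_profile_present.yml once per item (bond0, bond0.0, bond0.1), is roughly of this shape. The loop_var name bond_port_profile and the fact that profile arrives as an include param are taken from the trace; the rest is a sketch, not the verbatim file:

    - name: Include the task 'assert_profile_present.yml'
      include_tasks: tasks/assert_profile_present.yml
      vars:
        profile: "{{ bond_port_profile }}"
      loop:
        - "{{ controller_profile }}"
        - "{{ port1_profile }}"
        - "{{ port2_profile }}"
      loop_control:
        loop_var: bond_port_profile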
11728 1726882192.98368: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 11728 1726882192.98518: in run() - task 12673a56-9f93-5c28-a762-0000000004d9 11728 1726882192.98542: variable 'ansible_search_path' from source: unknown 11728 1726882192.98551: variable 'ansible_search_path' from source: unknown 11728 1726882192.98599: calling self._execute() 11728 1726882192.98713: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882192.98730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882192.98901: variable 'omit' from source: magic vars 11728 1726882192.99161: variable 'ansible_distribution_major_version' from source: facts 11728 1726882192.99180: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882192.99190: _execute() done 11728 1726882192.99203: dumping result to json 11728 1726882192.99212: done dumping result, returning 11728 1726882192.99225: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [12673a56-9f93-5c28-a762-0000000004d9] 11728 1726882192.99241: sending task result for task 12673a56-9f93-5c28-a762-0000000004d9 11728 1726882192.99427: no more pending results, returning what we have 11728 1726882192.99434: in VariableManager get_vars() 11728 1726882192.99479: Calling all_inventory to load vars for managed_node3 11728 1726882192.99482: Calling groups_inventory to load vars for managed_node3 11728 1726882192.99485: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882192.99504: Calling all_plugins_play to load vars for managed_node3 11728 1726882192.99507: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882192.99511: Calling groups_plugins_play to load vars for managed_node3 11728 1726882193.00108: done sending task result for task 12673a56-9f93-5c28-a762-0000000004d9 11728 1726882193.00112: WORKER PROCESS EXITING 11728 1726882193.02220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882193.04811: done with get_vars() 11728 1726882193.04841: variable 'ansible_search_path' from source: unknown 11728 1726882193.04843: variable 'ansible_search_path' from source: unknown 11728 1726882193.04884: we have included files to process 11728 1726882193.04885: generating all_blocks data 11728 1726882193.04887: done generating all_blocks data 11728 1726882193.04888: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11728 1726882193.04889: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11728 1726882193.04891: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11728 1726882193.06103: done processing included file 11728 1726882193.06105: iterating over new_blocks loaded from include file 11728 1726882193.06107: in VariableManager get_vars() 11728 1726882193.06124: done with get_vars() 11728 1726882193.06126: filtering new block on tags 11728 1726882193.06243: done filtering new block on tags 11728 1726882193.06247: in VariableManager get_vars() 11728 1726882193.06261: done with get_vars() 11728 1726882193.06262: filtering new block on tags 11728 1726882193.06320: done filtering new block on tags 11728 1726882193.06323: done iterating over 
new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 11728 1726882193.06327: extending task lists for all hosts with included blocks 11728 1726882193.06673: done extending task lists 11728 1726882193.06675: done processing included files 11728 1726882193.06676: results queue empty 11728 1726882193.06676: checking for any_errors_fatal 11728 1726882193.06680: done checking for any_errors_fatal 11728 1726882193.06681: checking for max_fail_percentage 11728 1726882193.06682: done checking for max_fail_percentage 11728 1726882193.06683: checking to see if all hosts have failed and the running result is not ok 11728 1726882193.06684: done checking to see if all hosts have failed 11728 1726882193.06684: getting the remaining hosts for this loop 11728 1726882193.06686: done getting the remaining hosts for this loop 11728 1726882193.06688: getting the next task for host managed_node3 11728 1726882193.06698: done getting next task for host managed_node3 11728 1726882193.06700: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 11728 1726882193.06704: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882193.06707: getting variables 11728 1726882193.06708: in VariableManager get_vars() 11728 1726882193.06717: Calling all_inventory to load vars for managed_node3 11728 1726882193.06719: Calling groups_inventory to load vars for managed_node3 11728 1726882193.06721: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882193.06727: Calling all_plugins_play to load vars for managed_node3 11728 1726882193.06729: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882193.06732: Calling groups_plugins_play to load vars for managed_node3 11728 1726882193.07937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882193.09471: done with get_vars() 11728 1726882193.09504: done getting variables 11728 1726882193.09550: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:29:53 -0400 (0:00:00.120) 0:00:17.948 ****** 11728 1726882193.09586: entering _queue_task() for managed_node3/set_fact 11728 1726882193.09939: worker is 1 (out of 1 available) 11728 1726882193.09950: exiting _queue_task() for managed_node3/set_fact 11728 1726882193.09961: done queuing things up, now waiting for results queue to drain 11728 1726882193.09962: waiting for pending results... 
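The set_fact task queued here (get_profile_stat.yml:3) initializes three flags; the exact keys and values appear in the task result further down in the trace, so the task is effectively:

    - name: Initialize NM profile exist and ansible_managed comment flag
      set_fact:
        lsr_net_profile_exists: false
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false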
11728 1726882193.10544: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 11728 1726882193.10678: in run() - task 12673a56-9f93-5c28-a762-0000000004fc 11728 1726882193.11099: variable 'ansible_search_path' from source: unknown 11728 1726882193.11103: variable 'ansible_search_path' from source: unknown 11728 1726882193.11106: calling self._execute() 11728 1726882193.11109: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882193.11111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882193.11115: variable 'omit' from source: magic vars 11728 1726882193.11803: variable 'ansible_distribution_major_version' from source: facts 11728 1726882193.12098: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882193.12102: variable 'omit' from source: magic vars 11728 1726882193.12104: variable 'omit' from source: magic vars 11728 1726882193.12123: variable 'omit' from source: magic vars 11728 1726882193.12165: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882193.12207: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882193.12498: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882193.12502: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882193.12505: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882193.12507: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882193.12509: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882193.12511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882193.12590: Set connection var ansible_connection to ssh 11728 1726882193.12898: Set connection var ansible_shell_executable to /bin/sh 11728 1726882193.12901: Set connection var ansible_timeout to 10 11728 1726882193.12904: Set connection var ansible_shell_type to sh 11728 1726882193.12906: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882193.12908: Set connection var ansible_pipelining to False 11728 1726882193.12910: variable 'ansible_shell_executable' from source: unknown 11728 1726882193.12912: variable 'ansible_connection' from source: unknown 11728 1726882193.12914: variable 'ansible_module_compression' from source: unknown 11728 1726882193.12916: variable 'ansible_shell_type' from source: unknown 11728 1726882193.12918: variable 'ansible_shell_executable' from source: unknown 11728 1726882193.12920: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882193.12926: variable 'ansible_pipelining' from source: unknown 11728 1726882193.12928: variable 'ansible_timeout' from source: unknown 11728 1726882193.12931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882193.13049: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882193.13398: variable 
'omit' from source: magic vars 11728 1726882193.13401: starting attempt loop 11728 1726882193.13403: running the handler 11728 1726882193.13406: handler run complete 11728 1726882193.13408: attempt loop complete, returning result 11728 1726882193.13410: _execute() done 11728 1726882193.13411: dumping result to json 11728 1726882193.13414: done dumping result, returning 11728 1726882193.13416: done running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [12673a56-9f93-5c28-a762-0000000004fc] 11728 1726882193.13418: sending task result for task 12673a56-9f93-5c28-a762-0000000004fc 11728 1726882193.13498: done sending task result for task 12673a56-9f93-5c28-a762-0000000004fc 11728 1726882193.13502: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 11728 1726882193.13559: no more pending results, returning what we have 11728 1726882193.13563: results queue empty 11728 1726882193.13564: checking for any_errors_fatal 11728 1726882193.13566: done checking for any_errors_fatal 11728 1726882193.13567: checking for max_fail_percentage 11728 1726882193.13569: done checking for max_fail_percentage 11728 1726882193.13570: checking to see if all hosts have failed and the running result is not ok 11728 1726882193.13570: done checking to see if all hosts have failed 11728 1726882193.13571: getting the remaining hosts for this loop 11728 1726882193.13573: done getting the remaining hosts for this loop 11728 1726882193.13576: getting the next task for host managed_node3 11728 1726882193.13586: done getting next task for host managed_node3 11728 1726882193.13589: ^ task is: TASK: Stat profile file 11728 1726882193.13598: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882193.13602: getting variables 11728 1726882193.13603: in VariableManager get_vars() 11728 1726882193.13639: Calling all_inventory to load vars for managed_node3 11728 1726882193.13642: Calling groups_inventory to load vars for managed_node3 11728 1726882193.13646: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882193.13657: Calling all_plugins_play to load vars for managed_node3 11728 1726882193.13660: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882193.13663: Calling groups_plugins_play to load vars for managed_node3 11728 1726882193.16847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882193.18639: done with get_vars() 11728 1726882193.18662: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:29:53 -0400 (0:00:00.091) 0:00:18.040 ****** 11728 1726882193.18772: entering _queue_task() for managed_node3/stat 11728 1726882193.19119: worker is 1 (out of 1 available) 11728 1726882193.19132: exiting _queue_task() for managed_node3/stat 11728 1726882193.19146: done queuing things up, now waiting for results queue to drain 11728 1726882193.19147: waiting for pending results... 11728 1726882193.19429: running TaskExecutor() for managed_node3/TASK: Stat profile file 11728 1726882193.19602: in run() - task 12673a56-9f93-5c28-a762-0000000004fd 11728 1726882193.19606: variable 'ansible_search_path' from source: unknown 11728 1726882193.19608: variable 'ansible_search_path' from source: unknown 11728 1726882193.19640: calling self._execute() 11728 1726882193.19738: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882193.19749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882193.19798: variable 'omit' from source: magic vars 11728 1726882193.20143: variable 'ansible_distribution_major_version' from source: facts 11728 1726882193.20165: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882193.20175: variable 'omit' from source: magic vars 11728 1726882193.20243: variable 'omit' from source: magic vars 11728 1726882193.20345: variable 'profile' from source: include params 11728 1726882193.20356: variable 'bond_port_profile' from source: include params 11728 1726882193.20598: variable 'bond_port_profile' from source: include params 11728 1726882193.20601: variable 'omit' from source: magic vars 11728 1726882193.20604: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882193.20606: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882193.20608: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882193.20610: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882193.20612: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882193.20619: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882193.20627: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 
1726882193.20634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882193.20736: Set connection var ansible_connection to ssh 11728 1726882193.20753: Set connection var ansible_shell_executable to /bin/sh 11728 1726882193.20764: Set connection var ansible_timeout to 10 11728 1726882193.20771: Set connection var ansible_shell_type to sh 11728 1726882193.20783: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882193.20796: Set connection var ansible_pipelining to False 11728 1726882193.20825: variable 'ansible_shell_executable' from source: unknown 11728 1726882193.20838: variable 'ansible_connection' from source: unknown 11728 1726882193.20846: variable 'ansible_module_compression' from source: unknown 11728 1726882193.20853: variable 'ansible_shell_type' from source: unknown 11728 1726882193.20860: variable 'ansible_shell_executable' from source: unknown 11728 1726882193.20867: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882193.20876: variable 'ansible_pipelining' from source: unknown 11728 1726882193.20883: variable 'ansible_timeout' from source: unknown 11728 1726882193.20890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882193.21089: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11728 1726882193.21109: variable 'omit' from source: magic vars 11728 1726882193.21120: starting attempt loop 11728 1726882193.21128: running the handler 11728 1726882193.21161: _low_level_execute_command(): starting 11728 1726882193.21164: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882193.21906: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882193.21926: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882193.22011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882193.22054: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882193.22074: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882193.22099: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882193.22277: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882193.24375: stdout chunk (state=3): >>>/root <<< 11728 1726882193.24379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 
1726882193.24382: stdout chunk (state=3): >>><<< 11728 1726882193.24384: stderr chunk (state=3): >>><<< 11728 1726882193.24388: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882193.24391: _low_level_execute_command(): starting 11728 1726882193.24401: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882193.2427008-12627-229711905244288 `" && echo ansible-tmp-1726882193.2427008-12627-229711905244288="` echo /root/.ansible/tmp/ansible-tmp-1726882193.2427008-12627-229711905244288 `" ) && sleep 0' 11728 1726882193.25683: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882193.25700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882193.25705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882193.25844: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882193.25847: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882193.25983: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882193.26115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882193.27986: stdout chunk (state=3): >>>ansible-tmp-1726882193.2427008-12627-229711905244288=/root/.ansible/tmp/ansible-tmp-1726882193.2427008-12627-229711905244288 <<< 11728 1726882193.28087: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 11728 1726882193.28151: stderr chunk (state=3): >>><<< 11728 1726882193.28155: stdout chunk (state=3): >>><<< 11728 1726882193.28370: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882193.2427008-12627-229711905244288=/root/.ansible/tmp/ansible-tmp-1726882193.2427008-12627-229711905244288 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882193.28373: variable 'ansible_module_compression' from source: unknown 11728 1726882193.28739: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11728 1726882193.28743: variable 'ansible_facts' from source: unknown 11728 1726882193.28764: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882193.2427008-12627-229711905244288/AnsiballZ_stat.py 11728 1726882193.29131: Sending initial data 11728 1726882193.29141: Sent initial data (153 bytes) 11728 1726882193.30442: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882193.30544: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882193.30557: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882193.30681: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882193.30724: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882193.32286: stderr chunk (state=3): >>>debug2: Remote 
version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882193.32321: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882193.32547: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpt3s2mudh /root/.ansible/tmp/ansible-tmp-1726882193.2427008-12627-229711905244288/AnsiballZ_stat.py <<< 11728 1726882193.32551: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882193.2427008-12627-229711905244288/AnsiballZ_stat.py" <<< 11728 1726882193.32583: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpt3s2mudh" to remote "/root/.ansible/tmp/ansible-tmp-1726882193.2427008-12627-229711905244288/AnsiballZ_stat.py" <<< 11728 1726882193.32625: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882193.2427008-12627-229711905244288/AnsiballZ_stat.py" <<< 11728 1726882193.33886: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882193.33956: stderr chunk (state=3): >>><<< 11728 1726882193.33966: stdout chunk (state=3): >>><<< 11728 1726882193.33992: done transferring module to remote 11728 1726882193.34013: _low_level_execute_command(): starting 11728 1726882193.34058: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882193.2427008-12627-229711905244288/ /root/.ansible/tmp/ansible-tmp-1726882193.2427008-12627-229711905244288/AnsiballZ_stat.py && sleep 0' 11728 1726882193.35271: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882193.35284: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882193.35410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882193.35581: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882193.35597: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882193.35626: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882193.35708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882193.37578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882193.37592: stdout chunk (state=3): >>><<< 11728 1726882193.37610: stderr chunk (state=3): >>><<< 11728 1726882193.37648: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882193.37819: _low_level_execute_command(): starting 11728 1726882193.37823: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882193.2427008-12627-229711905244288/AnsiballZ_stat.py && sleep 0' 11728 1726882193.38999: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882193.39003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882193.39005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11728 1726882193.39007: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882193.39010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882193.39079: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882193.39085: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882193.39266: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882193.39319: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882193.54245: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11728 1726882193.55603: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882193.55607: stderr chunk (state=3): >>><<< 11728 1726882193.55610: stdout chunk (state=3): >>><<< 11728 1726882193.55632: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
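The module invocation echoed in the stdout chunk above shows exactly which arguments the 'Stat profile file' task (get_profile_stat.yml:9) passed to the stat module. A sketch of the task consistent with that invocation follows; the register name profile_stat is an assumption (it is not shown in this part of the trace), and the ifcfg path is templated here on the profile include param that the trace shows resolving to bond0:

    - name: Stat profile file
      stat:
        path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: profile_stat   # register name assumed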
11728 1726882193.55660: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882193.2427008-12627-229711905244288/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882193.55669: _low_level_execute_command(): starting 11728 1726882193.55674: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882193.2427008-12627-229711905244288/ > /dev/null 2>&1 && sleep 0' 11728 1726882193.57016: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882193.57201: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882193.57249: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882193.57371: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882193.59568: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882193.59580: stdout chunk (state=3): >>><<< 11728 1726882193.59583: stderr chunk (state=3): >>><<< 11728 1726882193.59601: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882193.59604: handler run complete 11728 1726882193.59629: attempt loop complete, returning result 11728 1726882193.59632: _execute() done 11728 1726882193.59635: dumping result to json 11728 1726882193.59637: done dumping result, returning 11728 1726882193.59646: done running TaskExecutor() for managed_node3/TASK: Stat profile file [12673a56-9f93-5c28-a762-0000000004fd] 11728 1726882193.59651: sending task result for task 12673a56-9f93-5c28-a762-0000000004fd 11728 1726882193.59759: done sending task result for task 12673a56-9f93-5c28-a762-0000000004fd 11728 1726882193.59762: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 11728 1726882193.59827: no more pending results, returning what we have 11728 1726882193.59831: results queue empty 11728 1726882193.59832: checking for any_errors_fatal 11728 1726882193.59839: done checking for any_errors_fatal 11728 1726882193.59839: checking for max_fail_percentage 11728 1726882193.59841: done checking for max_fail_percentage 11728 1726882193.59842: checking to see if all hosts have failed and the running result is not ok 11728 1726882193.59843: done checking to see if all hosts have failed 11728 1726882193.59843: getting the remaining hosts for this loop 11728 1726882193.59845: done getting the remaining hosts for this loop 11728 1726882193.59849: getting the next task for host managed_node3 11728 1726882193.59858: done getting next task for host managed_node3 11728 1726882193.59861: ^ task is: TASK: Set NM profile exist flag based on the profile files 11728 1726882193.59867: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882193.59901: getting variables 11728 1726882193.59903: in VariableManager get_vars() 11728 1726882193.59938: Calling all_inventory to load vars for managed_node3 11728 1726882193.59941: Calling groups_inventory to load vars for managed_node3 11728 1726882193.59945: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882193.59956: Calling all_plugins_play to load vars for managed_node3 11728 1726882193.59959: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882193.59962: Calling groups_plugins_play to load vars for managed_node3 11728 1726882193.63720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882193.66564: done with get_vars() 11728 1726882193.66612: done getting variables 11728 1726882193.66740: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:29:53 -0400 (0:00:00.480) 0:00:18.520 ****** 11728 1726882193.66782: entering _queue_task() for managed_node3/set_fact 11728 1726882193.67399: worker is 1 (out of 1 available) 11728 1726882193.67412: exiting _queue_task() for managed_node3/set_fact 11728 1726882193.67423: done queuing things up, now waiting for results queue to drain 11728 1726882193.67425: waiting for pending results... 
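The 'Stat profile file' result above shows the stat module checking /etc/sysconfig/network-scripts/ifcfg-bond0 with get_attributes, get_checksum and get_mime disabled; the file does not exist on managed_node3, which is expected on a host whose NetworkManager profiles live under /etc/NetworkManager/system-connections/ rather than in initscripts ifcfg files. The task definition itself is not reproduced in this log; a minimal sketch of an equivalent task, assuming the registered variable name profile_stat that the later conditionals reference, could be:

    - name: Stat profile file
      ansible.builtin.stat:
        path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: profile_stat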
11728 1726882193.67811: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 11728 1726882193.67864: in run() - task 12673a56-9f93-5c28-a762-0000000004fe 11728 1726882193.67878: variable 'ansible_search_path' from source: unknown 11728 1726882193.67881: variable 'ansible_search_path' from source: unknown 11728 1726882193.67927: calling self._execute() 11728 1726882193.68050: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882193.68053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882193.68065: variable 'omit' from source: magic vars 11728 1726882193.68463: variable 'ansible_distribution_major_version' from source: facts 11728 1726882193.68474: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882193.68698: variable 'profile_stat' from source: set_fact 11728 1726882193.68701: Evaluated conditional (profile_stat.stat.exists): False 11728 1726882193.68703: when evaluation is False, skipping this task 11728 1726882193.68705: _execute() done 11728 1726882193.68707: dumping result to json 11728 1726882193.68709: done dumping result, returning 11728 1726882193.68711: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [12673a56-9f93-5c28-a762-0000000004fe] 11728 1726882193.68713: sending task result for task 12673a56-9f93-5c28-a762-0000000004fe 11728 1726882193.68776: done sending task result for task 12673a56-9f93-5c28-a762-0000000004fe 11728 1726882193.68779: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11728 1726882193.68831: no more pending results, returning what we have 11728 1726882193.68837: results queue empty 11728 1726882193.68838: checking for any_errors_fatal 11728 1726882193.68845: done checking for any_errors_fatal 11728 1726882193.68846: checking for max_fail_percentage 11728 1726882193.68848: done checking for max_fail_percentage 11728 1726882193.68849: checking to see if all hosts have failed and the running result is not ok 11728 1726882193.68850: done checking to see if all hosts have failed 11728 1726882193.68850: getting the remaining hosts for this loop 11728 1726882193.68852: done getting the remaining hosts for this loop 11728 1726882193.68855: getting the next task for host managed_node3 11728 1726882193.68864: done getting next task for host managed_node3 11728 1726882193.68866: ^ task is: TASK: Get NM profile info 11728 1726882193.68877: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882193.68881: getting variables 11728 1726882193.68883: in VariableManager get_vars() 11728 1726882193.68919: Calling all_inventory to load vars for managed_node3 11728 1726882193.68922: Calling groups_inventory to load vars for managed_node3 11728 1726882193.68926: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882193.68938: Calling all_plugins_play to load vars for managed_node3 11728 1726882193.68941: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882193.68944: Calling groups_plugins_play to load vars for managed_node3 11728 1726882193.71381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882193.74424: done with get_vars() 11728 1726882193.74449: done getting variables 11728 1726882193.74549: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:29:53 -0400 (0:00:00.079) 0:00:18.599 ****** 11728 1726882193.74703: entering _queue_task() for managed_node3/shell 11728 1726882193.75288: worker is 1 (out of 1 available) 11728 1726882193.75406: exiting _queue_task() for managed_node3/shell 11728 1726882193.75417: done queuing things up, now waiting for results queue to drain 11728 1726882193.75419: waiting for pending results... 
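The set_fact task above was skipped because profile_stat.stat.exists evaluated to False, so no flag was derived from an ifcfg profile file. A minimal sketch of such a conditional set_fact, assuming it sets the lsr_net_profile_exists fact that appears later in this run (the real task body is not shown here because the task never executed), could be:

    - name: Set NM profile exist flag based on the profile files
      ansible.builtin.set_fact:
        lsr_net_profile_exists: true
      when: profile_stat.stat.exists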
11728 1726882193.75767: running TaskExecutor() for managed_node3/TASK: Get NM profile info 11728 1726882193.76099: in run() - task 12673a56-9f93-5c28-a762-0000000004ff 11728 1726882193.76300: variable 'ansible_search_path' from source: unknown 11728 1726882193.76303: variable 'ansible_search_path' from source: unknown 11728 1726882193.76306: calling self._execute() 11728 1726882193.76308: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882193.76312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882193.76315: variable 'omit' from source: magic vars 11728 1726882193.77047: variable 'ansible_distribution_major_version' from source: facts 11728 1726882193.77313: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882193.77499: variable 'omit' from source: magic vars 11728 1726882193.77502: variable 'omit' from source: magic vars 11728 1726882193.77504: variable 'profile' from source: include params 11728 1726882193.77507: variable 'bond_port_profile' from source: include params 11728 1726882193.77757: variable 'bond_port_profile' from source: include params 11728 1726882193.77783: variable 'omit' from source: magic vars 11728 1726882193.77831: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882193.77871: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882193.77901: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882193.78298: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882193.78302: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882193.78305: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882193.78308: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882193.78311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882193.78314: Set connection var ansible_connection to ssh 11728 1726882193.78317: Set connection var ansible_shell_executable to /bin/sh 11728 1726882193.78320: Set connection var ansible_timeout to 10 11728 1726882193.78322: Set connection var ansible_shell_type to sh 11728 1726882193.78325: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882193.78328: Set connection var ansible_pipelining to False 11728 1726882193.78513: variable 'ansible_shell_executable' from source: unknown 11728 1726882193.78522: variable 'ansible_connection' from source: unknown 11728 1726882193.78529: variable 'ansible_module_compression' from source: unknown 11728 1726882193.78537: variable 'ansible_shell_type' from source: unknown 11728 1726882193.78544: variable 'ansible_shell_executable' from source: unknown 11728 1726882193.78551: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882193.78560: variable 'ansible_pipelining' from source: unknown 11728 1726882193.78567: variable 'ansible_timeout' from source: unknown 11728 1726882193.78575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882193.79098: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882193.79102: variable 'omit' from source: magic vars 11728 1726882193.79105: starting attempt loop 11728 1726882193.79108: running the handler 11728 1726882193.79110: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882193.79113: _low_level_execute_command(): starting 11728 1726882193.79115: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882193.80444: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882193.80461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882193.80474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882193.80643: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882193.80655: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882193.80725: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882193.80791: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882193.82405: stdout chunk (state=3): >>>/root <<< 11728 1726882193.82505: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882193.82542: stderr chunk (state=3): >>><<< 11728 1726882193.82553: stdout chunk (state=3): >>><<< 11728 1726882193.82725: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882193.82745: _low_level_execute_command(): starting 11728 1726882193.82755: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882193.8273218-12658-29622622387944 `" && echo ansible-tmp-1726882193.8273218-12658-29622622387944="` echo /root/.ansible/tmp/ansible-tmp-1726882193.8273218-12658-29622622387944 `" ) && sleep 0' 11728 1726882193.84110: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882193.84322: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882193.86091: stdout chunk (state=3): >>>ansible-tmp-1726882193.8273218-12658-29622622387944=/root/.ansible/tmp/ansible-tmp-1726882193.8273218-12658-29622622387944 <<< 11728 1726882193.86259: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882193.86289: stderr chunk (state=3): >>><<< 11728 1726882193.86301: stdout chunk (state=3): >>><<< 11728 1726882193.86323: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882193.8273218-12658-29622622387944=/root/.ansible/tmp/ansible-tmp-1726882193.8273218-12658-29622622387944 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882193.86362: variable 'ansible_module_compression' from source: unknown 11728 1726882193.86554: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882193.86599: variable 'ansible_facts' from source: unknown 11728 1726882193.86682: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882193.8273218-12658-29622622387944/AnsiballZ_command.py 11728 1726882193.87321: Sending initial data 11728 1726882193.87324: Sent initial data (155 bytes) 11728 1726882193.88475: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882193.88479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882193.88482: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882193.88484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882193.88857: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882193.90719: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882193.8273218-12658-29622622387944/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmp4a58k2z2" to remote "/root/.ansible/tmp/ansible-tmp-1726882193.8273218-12658-29622622387944/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882193.8273218-12658-29622622387944/AnsiballZ_command.py" <<< 11728 1726882193.90723: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmp4a58k2z2 /root/.ansible/tmp/ansible-tmp-1726882193.8273218-12658-29622622387944/AnsiballZ_command.py <<< 11728 1726882193.92023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882193.92028: stdout chunk (state=3): >>><<< 11728 1726882193.92030: stderr chunk (state=3): >>><<< 11728 1726882193.92033: done transferring module to remote 11728 1726882193.92035: _low_level_execute_command(): starting 11728 1726882193.92037: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882193.8273218-12658-29622622387944/ /root/.ansible/tmp/ansible-tmp-1726882193.8273218-12658-29622622387944/AnsiballZ_command.py && sleep 0' 11728 1726882193.93129: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882193.93138: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882193.93148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882193.93162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882193.93174: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882193.93181: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882193.93191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882193.93210: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882193.93219: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882193.93229: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11728 1726882193.93234: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882193.93244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882193.93256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882193.93264: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882193.93271: stderr chunk (state=3): >>>debug2: match found <<< 11728 1726882193.93280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882193.93452: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882193.93613: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882193.93632: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882193.93706: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 
1726882193.95559: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882193.95563: stdout chunk (state=3): >>><<< 11728 1726882193.95570: stderr chunk (state=3): >>><<< 11728 1726882193.95659: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882193.95662: _low_level_execute_command(): starting 11728 1726882193.95665: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882193.8273218-12658-29622622387944/AnsiballZ_command.py && sleep 0' 11728 1726882193.96902: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882193.96905: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882193.96918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882193.96938: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882193.96963: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882193.97045: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882194.13877: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep 
/etc", "start": "2024-09-20 21:29:54.117185", "end": "2024-09-20 21:29:54.137098", "delta": "0:00:00.019913", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882194.15466: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882194.15470: stdout chunk (state=3): >>><<< 11728 1726882194.15472: stderr chunk (state=3): >>><<< 11728 1726882194.15491: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-20 21:29:54.117185", "end": "2024-09-20 21:29:54.137098", "delta": "0:00:00.019913", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
11728 1726882194.15538: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882193.8273218-12658-29622622387944/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882194.15617: _low_level_execute_command(): starting 11728 1726882194.15709: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882193.8273218-12658-29622622387944/ > /dev/null 2>&1 && sleep 0' 11728 1726882194.16707: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882194.16911: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882194.16986: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882194.17111: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882194.17155: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882194.19652: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882194.19665: stdout chunk (state=3): >>><<< 11728 1726882194.19678: stderr chunk (state=3): >>><<< 11728 1726882194.19708: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882194.19909: handler run complete 11728 1726882194.19913: Evaluated conditional (False): False 11728 1726882194.19915: attempt loop complete, returning result 11728 1726882194.19917: _execute() done 11728 1726882194.19919: dumping result to json 11728 1726882194.19921: done dumping result, returning 11728 1726882194.19923: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [12673a56-9f93-5c28-a762-0000000004ff] 11728 1726882194.19925: sending task result for task 12673a56-9f93-5c28-a762-0000000004ff ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "delta": "0:00:00.019913", "end": "2024-09-20 21:29:54.137098", "rc": 0, "start": "2024-09-20 21:29:54.117185" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection bond0 /etc/NetworkManager/system-connections/bond0.nmconnection 11728 1726882194.20087: no more pending results, returning what we have 11728 1726882194.20091: results queue empty 11728 1726882194.20092: checking for any_errors_fatal 11728 1726882194.20102: done checking for any_errors_fatal 11728 1726882194.20103: checking for max_fail_percentage 11728 1726882194.20105: done checking for max_fail_percentage 11728 1726882194.20106: checking to see if all hosts have failed and the running result is not ok 11728 1726882194.20107: done checking to see if all hosts have failed 11728 1726882194.20107: getting the remaining hosts for this loop 11728 1726882194.20109: done getting the remaining hosts for this loop 11728 1726882194.20113: getting the next task for host managed_node3 11728 1726882194.20201: done getting next task for host managed_node3 11728 1726882194.20205: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11728 1726882194.20211: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882194.20216: getting variables 11728 1726882194.20218: in VariableManager get_vars() 11728 1726882194.20426: Calling all_inventory to load vars for managed_node3 11728 1726882194.20429: Calling groups_inventory to load vars for managed_node3 11728 1726882194.20433: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882194.20559: Calling all_plugins_play to load vars for managed_node3 11728 1726882194.20563: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882194.20567: Calling groups_plugins_play to load vars for managed_node3 11728 1726882194.21168: done sending task result for task 12673a56-9f93-5c28-a762-0000000004ff 11728 1726882194.21172: WORKER PROCESS EXITING 11728 1726882194.23473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882194.26874: done with get_vars() 11728 1726882194.26908: done getting variables 11728 1726882194.27060: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:29:54 -0400 (0:00:00.523) 0:00:19.123 ****** 11728 1726882194.27248: entering _queue_task() for managed_node3/set_fact 11728 1726882194.27841: worker is 1 (out of 1 available) 11728 1726882194.27854: exiting _queue_task() for managed_node3/set_fact 11728 1726882194.27866: done queuing things up, now waiting for results queue to drain 11728 1726882194.27867: waiting for pending results... 
11728 1726882194.28378: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11728 1726882194.28708: in run() - task 12673a56-9f93-5c28-a762-000000000500 11728 1726882194.28724: variable 'ansible_search_path' from source: unknown 11728 1726882194.28728: variable 'ansible_search_path' from source: unknown 11728 1726882194.28765: calling self._execute() 11728 1726882194.28863: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882194.28868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882194.28877: variable 'omit' from source: magic vars 11728 1726882194.29680: variable 'ansible_distribution_major_version' from source: facts 11728 1726882194.30099: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882194.30103: variable 'nm_profile_exists' from source: set_fact 11728 1726882194.30106: Evaluated conditional (nm_profile_exists.rc == 0): True 11728 1726882194.30108: variable 'omit' from source: magic vars 11728 1726882194.30400: variable 'omit' from source: magic vars 11728 1726882194.30404: variable 'omit' from source: magic vars 11728 1726882194.30412: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882194.30455: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882194.30480: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882194.30803: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882194.30807: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882194.30809: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882194.30812: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882194.30814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882194.30877: Set connection var ansible_connection to ssh 11728 1726882194.30896: Set connection var ansible_shell_executable to /bin/sh 11728 1726882194.30910: Set connection var ansible_timeout to 10 11728 1726882194.30917: Set connection var ansible_shell_type to sh 11728 1726882194.30931: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882194.30941: Set connection var ansible_pipelining to False 11728 1726882194.31025: variable 'ansible_shell_executable' from source: unknown 11728 1726882194.31033: variable 'ansible_connection' from source: unknown 11728 1726882194.31040: variable 'ansible_module_compression' from source: unknown 11728 1726882194.31048: variable 'ansible_shell_type' from source: unknown 11728 1726882194.31055: variable 'ansible_shell_executable' from source: unknown 11728 1726882194.31062: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882194.31071: variable 'ansible_pipelining' from source: unknown 11728 1726882194.31081: variable 'ansible_timeout' from source: unknown 11728 1726882194.31157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882194.31339: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882194.31350: variable 'omit' from source: magic vars 11728 1726882194.31355: starting attempt loop 11728 1726882194.31358: running the handler 11728 1726882194.31377: handler run complete 11728 1726882194.31388: attempt loop complete, returning result 11728 1726882194.31391: _execute() done 11728 1726882194.31398: dumping result to json 11728 1726882194.31400: done dumping result, returning 11728 1726882194.31407: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12673a56-9f93-5c28-a762-000000000500] 11728 1726882194.31414: sending task result for task 12673a56-9f93-5c28-a762-000000000500 11728 1726882194.31504: done sending task result for task 12673a56-9f93-5c28-a762-000000000500 11728 1726882194.31507: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 11728 1726882194.31569: no more pending results, returning what we have 11728 1726882194.31573: results queue empty 11728 1726882194.31574: checking for any_errors_fatal 11728 1726882194.31582: done checking for any_errors_fatal 11728 1726882194.31582: checking for max_fail_percentage 11728 1726882194.31584: done checking for max_fail_percentage 11728 1726882194.31585: checking to see if all hosts have failed and the running result is not ok 11728 1726882194.31586: done checking to see if all hosts have failed 11728 1726882194.31587: getting the remaining hosts for this loop 11728 1726882194.31589: done getting the remaining hosts for this loop 11728 1726882194.31592: getting the next task for host managed_node3 11728 1726882194.31605: done getting next task for host managed_node3 11728 1726882194.31607: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 11728 1726882194.31613: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882194.31616: getting variables 11728 1726882194.31617: in VariableManager get_vars() 11728 1726882194.31649: Calling all_inventory to load vars for managed_node3 11728 1726882194.31652: Calling groups_inventory to load vars for managed_node3 11728 1726882194.31655: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882194.31665: Calling all_plugins_play to load vars for managed_node3 11728 1726882194.31667: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882194.31670: Calling groups_plugins_play to load vars for managed_node3 11728 1726882194.33603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882194.36032: done with get_vars() 11728 1726882194.36067: done getting variables 11728 1726882194.36129: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11728 1726882194.36252: variable 'profile' from source: include params 11728 1726882194.36256: variable 'bond_port_profile' from source: include params 11728 1726882194.36325: variable 'bond_port_profile' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0] ************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:29:54 -0400 (0:00:00.092) 0:00:19.216 ****** 11728 1726882194.36359: entering _queue_task() for managed_node3/command 11728 1726882194.36764: worker is 1 (out of 1 available) 11728 1726882194.36775: exiting _queue_task() for managed_node3/command 11728 1726882194.36786: done queuing things up, now waiting for results queue to drain 11728 1726882194.36787: waiting for pending results... 
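Because nm_profile_exists.rc == 0, the set_fact above marks the bond0 profile as present, ansible-managed and fingerprinted in one step. A minimal sketch, reconstructed from the evaluated conditional and the ansible_facts shown in the result, could be:

    - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
      ansible.builtin.set_fact:
        lsr_net_profile_exists: true
        lsr_net_profile_ansible_managed: true
        lsr_net_profile_fingerprint: true
      when: nm_profile_exists.rc == 0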
11728 1726882194.37822: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0 11728 1726882194.37827: in run() - task 12673a56-9f93-5c28-a762-000000000502 11728 1726882194.37829: variable 'ansible_search_path' from source: unknown 11728 1726882194.37832: variable 'ansible_search_path' from source: unknown 11728 1726882194.38401: calling self._execute() 11728 1726882194.38405: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882194.38407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882194.38410: variable 'omit' from source: magic vars 11728 1726882194.39212: variable 'ansible_distribution_major_version' from source: facts 11728 1726882194.39500: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882194.39538: variable 'profile_stat' from source: set_fact 11728 1726882194.39555: Evaluated conditional (profile_stat.stat.exists): False 11728 1726882194.39562: when evaluation is False, skipping this task 11728 1726882194.39570: _execute() done 11728 1726882194.39578: dumping result to json 11728 1726882194.39585: done dumping result, returning 11728 1726882194.39900: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0 [12673a56-9f93-5c28-a762-000000000502] 11728 1726882194.39903: sending task result for task 12673a56-9f93-5c28-a762-000000000502 11728 1726882194.39968: done sending task result for task 12673a56-9f93-5c28-a762-000000000502 11728 1726882194.39972: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11728 1726882194.40024: no more pending results, returning what we have 11728 1726882194.40028: results queue empty 11728 1726882194.40029: checking for any_errors_fatal 11728 1726882194.40036: done checking for any_errors_fatal 11728 1726882194.40037: checking for max_fail_percentage 11728 1726882194.40039: done checking for max_fail_percentage 11728 1726882194.40040: checking to see if all hosts have failed and the running result is not ok 11728 1726882194.40040: done checking to see if all hosts have failed 11728 1726882194.40041: getting the remaining hosts for this loop 11728 1726882194.40043: done getting the remaining hosts for this loop 11728 1726882194.40046: getting the next task for host managed_node3 11728 1726882194.40054: done getting next task for host managed_node3 11728 1726882194.40056: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 11728 1726882194.40063: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882194.40067: getting variables 11728 1726882194.40068: in VariableManager get_vars() 11728 1726882194.40101: Calling all_inventory to load vars for managed_node3 11728 1726882194.40103: Calling groups_inventory to load vars for managed_node3 11728 1726882194.40106: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882194.40116: Calling all_plugins_play to load vars for managed_node3 11728 1726882194.40118: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882194.40120: Calling groups_plugins_play to load vars for managed_node3 11728 1726882194.44328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882194.49092: done with get_vars() 11728 1726882194.49130: done getting variables 11728 1726882194.49196: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11728 1726882194.49455: variable 'profile' from source: include params 11728 1726882194.49459: variable 'bond_port_profile' from source: include params 11728 1726882194.49523: variable 'bond_port_profile' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0] *********************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:29:54 -0400 (0:00:00.131) 0:00:19.348 ****** 11728 1726882194.49562: entering _queue_task() for managed_node3/set_fact 11728 1726882194.50483: worker is 1 (out of 1 available) 11728 1726882194.50495: exiting _queue_task() for managed_node3/set_fact 11728 1726882194.50505: done queuing things up, now waiting for results queue to drain 11728 1726882194.50506: waiting for pending results... 
11728 1726882194.50964: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0 11728 1726882194.51304: in run() - task 12673a56-9f93-5c28-a762-000000000503 11728 1726882194.51347: variable 'ansible_search_path' from source: unknown 11728 1726882194.51351: variable 'ansible_search_path' from source: unknown 11728 1726882194.51385: calling self._execute() 11728 1726882194.51700: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882194.51704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882194.51707: variable 'omit' from source: magic vars 11728 1726882194.52349: variable 'ansible_distribution_major_version' from source: facts 11728 1726882194.52360: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882194.52537: variable 'profile_stat' from source: set_fact 11728 1726882194.52540: Evaluated conditional (profile_stat.stat.exists): False 11728 1726882194.52543: when evaluation is False, skipping this task 11728 1726882194.52545: _execute() done 11728 1726882194.52547: dumping result to json 11728 1726882194.52654: done dumping result, returning 11728 1726882194.52660: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0 [12673a56-9f93-5c28-a762-000000000503] 11728 1726882194.52667: sending task result for task 12673a56-9f93-5c28-a762-000000000503 11728 1726882194.52769: done sending task result for task 12673a56-9f93-5c28-a762-000000000503 11728 1726882194.52774: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11728 1726882194.52829: no more pending results, returning what we have 11728 1726882194.52834: results queue empty 11728 1726882194.52836: checking for any_errors_fatal 11728 1726882194.52842: done checking for any_errors_fatal 11728 1726882194.52843: checking for max_fail_percentage 11728 1726882194.52845: done checking for max_fail_percentage 11728 1726882194.52846: checking to see if all hosts have failed and the running result is not ok 11728 1726882194.52847: done checking to see if all hosts have failed 11728 1726882194.52848: getting the remaining hosts for this loop 11728 1726882194.52850: done getting the remaining hosts for this loop 11728 1726882194.52853: getting the next task for host managed_node3 11728 1726882194.52863: done getting next task for host managed_node3 11728 1726882194.52866: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 11728 1726882194.52873: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882194.52877: getting variables 11728 1726882194.52878: in VariableManager get_vars() 11728 1726882194.52914: Calling all_inventory to load vars for managed_node3 11728 1726882194.52917: Calling groups_inventory to load vars for managed_node3 11728 1726882194.52921: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882194.52934: Calling all_plugins_play to load vars for managed_node3 11728 1726882194.52936: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882194.52940: Calling groups_plugins_play to load vars for managed_node3 11728 1726882194.56177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882194.59475: done with get_vars() 11728 1726882194.59507: done getting variables 11728 1726882194.59660: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11728 1726882194.59930: variable 'profile' from source: include params 11728 1726882194.59934: variable 'bond_port_profile' from source: include params 11728 1726882194.60048: variable 'bond_port_profile' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0] ****************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:29:54 -0400 (0:00:00.106) 0:00:19.454 ****** 11728 1726882194.60200: entering _queue_task() for managed_node3/command 11728 1726882194.60862: worker is 1 (out of 1 available) 11728 1726882194.60875: exiting _queue_task() for managed_node3/command 11728 1726882194.60887: done queuing things up, now waiting for results queue to drain 11728 1726882194.60888: waiting for pending results... 
11728 1726882194.61718: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0 11728 1726882194.62166: in run() - task 12673a56-9f93-5c28-a762-000000000504 11728 1726882194.62170: variable 'ansible_search_path' from source: unknown 11728 1726882194.62174: variable 'ansible_search_path' from source: unknown 11728 1726882194.62179: calling self._execute() 11728 1726882194.62492: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882194.62501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882194.62708: variable 'omit' from source: magic vars 11728 1726882194.63522: variable 'ansible_distribution_major_version' from source: facts 11728 1726882194.63534: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882194.63772: variable 'profile_stat' from source: set_fact 11728 1726882194.63783: Evaluated conditional (profile_stat.stat.exists): False 11728 1726882194.63787: when evaluation is False, skipping this task 11728 1726882194.63789: _execute() done 11728 1726882194.63792: dumping result to json 11728 1726882194.63801: done dumping result, returning 11728 1726882194.63911: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0 [12673a56-9f93-5c28-a762-000000000504] 11728 1726882194.64018: sending task result for task 12673a56-9f93-5c28-a762-000000000504 11728 1726882194.64087: done sending task result for task 12673a56-9f93-5c28-a762-000000000504 11728 1726882194.64091: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11728 1726882194.64146: no more pending results, returning what we have 11728 1726882194.64151: results queue empty 11728 1726882194.64152: checking for any_errors_fatal 11728 1726882194.64158: done checking for any_errors_fatal 11728 1726882194.64159: checking for max_fail_percentage 11728 1726882194.64161: done checking for max_fail_percentage 11728 1726882194.64162: checking to see if all hosts have failed and the running result is not ok 11728 1726882194.64162: done checking to see if all hosts have failed 11728 1726882194.64163: getting the remaining hosts for this loop 11728 1726882194.64165: done getting the remaining hosts for this loop 11728 1726882194.64169: getting the next task for host managed_node3 11728 1726882194.64177: done getting next task for host managed_node3 11728 1726882194.64180: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 11728 1726882194.64186: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882194.64191: getting variables 11728 1726882194.64194: in VariableManager get_vars() 11728 1726882194.64230: Calling all_inventory to load vars for managed_node3 11728 1726882194.64233: Calling groups_inventory to load vars for managed_node3 11728 1726882194.64236: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882194.64250: Calling all_plugins_play to load vars for managed_node3 11728 1726882194.64253: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882194.64256: Calling groups_plugins_play to load vars for managed_node3 11728 1726882194.67418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882194.70891: done with get_vars() 11728 1726882194.70962: done getting variables 11728 1726882194.71027: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11728 1726882194.71256: variable 'profile' from source: include params 11728 1726882194.71260: variable 'bond_port_profile' from source: include params 11728 1726882194.71455: variable 'bond_port_profile' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:29:54 -0400 (0:00:00.112) 0:00:19.567 ****** 11728 1726882194.71601: entering _queue_task() for managed_node3/set_fact 11728 1726882194.72468: worker is 1 (out of 1 available) 11728 1726882194.72479: exiting _queue_task() for managed_node3/set_fact 11728 1726882194.72490: done queuing things up, now waiting for results queue to drain 11728 1726882194.72491: waiting for pending results... 
11728 1726882194.72791: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0 11728 1726882194.73222: in run() - task 12673a56-9f93-5c28-a762-000000000505 11728 1726882194.73226: variable 'ansible_search_path' from source: unknown 11728 1726882194.73229: variable 'ansible_search_path' from source: unknown 11728 1726882194.73232: calling self._execute() 11728 1726882194.73482: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882194.73485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882194.73499: variable 'omit' from source: magic vars 11728 1726882194.74091: variable 'ansible_distribution_major_version' from source: facts 11728 1726882194.74098: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882194.74127: variable 'profile_stat' from source: set_fact 11728 1726882194.74137: Evaluated conditional (profile_stat.stat.exists): False 11728 1726882194.74140: when evaluation is False, skipping this task 11728 1726882194.74143: _execute() done 11728 1726882194.74146: dumping result to json 11728 1726882194.74148: done dumping result, returning 11728 1726882194.74156: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0 [12673a56-9f93-5c28-a762-000000000505] 11728 1726882194.74161: sending task result for task 12673a56-9f93-5c28-a762-000000000505 11728 1726882194.74259: done sending task result for task 12673a56-9f93-5c28-a762-000000000505 11728 1726882194.74262: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11728 1726882194.74322: no more pending results, returning what we have 11728 1726882194.74327: results queue empty 11728 1726882194.74329: checking for any_errors_fatal 11728 1726882194.74338: done checking for any_errors_fatal 11728 1726882194.74339: checking for max_fail_percentage 11728 1726882194.74341: done checking for max_fail_percentage 11728 1726882194.74342: checking to see if all hosts have failed and the running result is not ok 11728 1726882194.74343: done checking to see if all hosts have failed 11728 1726882194.74344: getting the remaining hosts for this loop 11728 1726882194.74346: done getting the remaining hosts for this loop 11728 1726882194.74349: getting the next task for host managed_node3 11728 1726882194.74360: done getting next task for host managed_node3 11728 1726882194.74363: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 11728 1726882194.74368: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882194.74372: getting variables 11728 1726882194.74373: in VariableManager get_vars() 11728 1726882194.74407: Calling all_inventory to load vars for managed_node3 11728 1726882194.74410: Calling groups_inventory to load vars for managed_node3 11728 1726882194.74413: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882194.74541: Calling all_plugins_play to load vars for managed_node3 11728 1726882194.74545: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882194.74549: Calling groups_plugins_play to load vars for managed_node3 11728 1726882194.76047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882194.77657: done with get_vars() 11728 1726882194.77680: done getting variables 11728 1726882194.77955: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11728 1726882194.78192: variable 'profile' from source: include params 11728 1726882194.78246: variable 'bond_port_profile' from source: include params 11728 1726882194.78361: variable 'bond_port_profile' from source: include params TASK [Assert that the profile is present - 'bond0'] **************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:29:54 -0400 (0:00:00.069) 0:00:19.636 ****** 11728 1726882194.78397: entering _queue_task() for managed_node3/assert 11728 1726882194.79320: worker is 1 (out of 1 available) 11728 1726882194.79331: exiting _queue_task() for managed_node3/assert 11728 1726882194.79341: done queuing things up, now waiting for results queue to drain 11728 1726882194.79342: waiting for pending results... 
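The next three tasks come from assert_profile_present.yml (the task path banners that follow show lines 5, 10, and 15), and each one checks a single lsr_net_profile_* flag. A minimal sketch reconstructed only from the task names and the conditionals the log evaluates; the real file may add fail_msg or other options:

- name: Assert that the profile is present - '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_exists

- name: Assert that the ansible managed comment is present in '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_ansible_managed

- name: Assert that the fingerprint comment is present in {{ profile }}
  assert:
    that:
      - lsr_net_profile_fingerprint

All three flags evaluate true here, so each handler reports "All assertions passed"; assert is a controller-side action, which is why no _low_level_execute_command() calls appear for these tasks.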
11728 1726882194.79602: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0' 11728 1726882194.79608: in run() - task 12673a56-9f93-5c28-a762-0000000004da 11728 1726882194.79611: variable 'ansible_search_path' from source: unknown 11728 1726882194.79614: variable 'ansible_search_path' from source: unknown 11728 1726882194.79627: calling self._execute() 11728 1726882194.79720: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882194.79724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882194.79734: variable 'omit' from source: magic vars 11728 1726882194.80101: variable 'ansible_distribution_major_version' from source: facts 11728 1726882194.80111: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882194.80127: variable 'omit' from source: magic vars 11728 1726882194.80175: variable 'omit' from source: magic vars 11728 1726882194.80300: variable 'profile' from source: include params 11728 1726882194.80304: variable 'bond_port_profile' from source: include params 11728 1726882194.80347: variable 'bond_port_profile' from source: include params 11728 1726882194.80366: variable 'omit' from source: magic vars 11728 1726882194.80406: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882194.80452: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882194.80500: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882194.80503: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882194.80506: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882194.80524: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882194.80528: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882194.80532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882194.80629: Set connection var ansible_connection to ssh 11728 1726882194.80668: Set connection var ansible_shell_executable to /bin/sh 11728 1726882194.80676: Set connection var ansible_timeout to 10 11728 1726882194.80679: Set connection var ansible_shell_type to sh 11728 1726882194.80681: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882194.80685: Set connection var ansible_pipelining to False 11728 1726882194.80711: variable 'ansible_shell_executable' from source: unknown 11728 1726882194.80713: variable 'ansible_connection' from source: unknown 11728 1726882194.80716: variable 'ansible_module_compression' from source: unknown 11728 1726882194.80718: variable 'ansible_shell_type' from source: unknown 11728 1726882194.80720: variable 'ansible_shell_executable' from source: unknown 11728 1726882194.80722: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882194.80724: variable 'ansible_pipelining' from source: unknown 11728 1726882194.80726: variable 'ansible_timeout' from source: unknown 11728 1726882194.80728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882194.81201: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882194.81205: variable 'omit' from source: magic vars 11728 1726882194.81207: starting attempt loop 11728 1726882194.81214: running the handler 11728 1726882194.81400: variable 'lsr_net_profile_exists' from source: set_fact 11728 1726882194.81403: Evaluated conditional (lsr_net_profile_exists): True 11728 1726882194.81406: handler run complete 11728 1726882194.81408: attempt loop complete, returning result 11728 1726882194.81409: _execute() done 11728 1726882194.81411: dumping result to json 11728 1726882194.81413: done dumping result, returning 11728 1726882194.81415: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0' [12673a56-9f93-5c28-a762-0000000004da] 11728 1726882194.81416: sending task result for task 12673a56-9f93-5c28-a762-0000000004da 11728 1726882194.81477: done sending task result for task 12673a56-9f93-5c28-a762-0000000004da 11728 1726882194.81480: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11728 1726882194.81534: no more pending results, returning what we have 11728 1726882194.81539: results queue empty 11728 1726882194.81540: checking for any_errors_fatal 11728 1726882194.81548: done checking for any_errors_fatal 11728 1726882194.81549: checking for max_fail_percentage 11728 1726882194.81551: done checking for max_fail_percentage 11728 1726882194.81552: checking to see if all hosts have failed and the running result is not ok 11728 1726882194.81552: done checking to see if all hosts have failed 11728 1726882194.81553: getting the remaining hosts for this loop 11728 1726882194.81555: done getting the remaining hosts for this loop 11728 1726882194.81559: getting the next task for host managed_node3 11728 1726882194.81567: done getting next task for host managed_node3 11728 1726882194.81572: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 11728 1726882194.81577: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882194.81581: getting variables 11728 1726882194.81583: in VariableManager get_vars() 11728 1726882194.81728: Calling all_inventory to load vars for managed_node3 11728 1726882194.81731: Calling groups_inventory to load vars for managed_node3 11728 1726882194.81735: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882194.81746: Calling all_plugins_play to load vars for managed_node3 11728 1726882194.81749: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882194.81752: Calling groups_plugins_play to load vars for managed_node3 11728 1726882194.83445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882194.85118: done with get_vars() 11728 1726882194.85156: done getting variables 11728 1726882194.85233: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11728 1726882194.85367: variable 'profile' from source: include params 11728 1726882194.85371: variable 'bond_port_profile' from source: include params 11728 1726882194.85444: variable 'bond_port_profile' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0'] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:29:54 -0400 (0:00:00.070) 0:00:19.707 ****** 11728 1726882194.85481: entering _queue_task() for managed_node3/assert 11728 1726882194.85843: worker is 1 (out of 1 available) 11728 1726882194.86100: exiting _queue_task() for managed_node3/assert 11728 1726882194.86109: done queuing things up, now waiting for results queue to drain 11728 1726882194.86110: waiting for pending results... 
11728 1726882194.86321: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0' 11728 1726882194.86326: in run() - task 12673a56-9f93-5c28-a762-0000000004db 11728 1726882194.86329: variable 'ansible_search_path' from source: unknown 11728 1726882194.86332: variable 'ansible_search_path' from source: unknown 11728 1726882194.86364: calling self._execute() 11728 1726882194.86451: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882194.86458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882194.86467: variable 'omit' from source: magic vars 11728 1726882194.87008: variable 'ansible_distribution_major_version' from source: facts 11728 1726882194.87012: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882194.87014: variable 'omit' from source: magic vars 11728 1726882194.87017: variable 'omit' from source: magic vars 11728 1726882194.87069: variable 'profile' from source: include params 11728 1726882194.87072: variable 'bond_port_profile' from source: include params 11728 1726882194.87147: variable 'bond_port_profile' from source: include params 11728 1726882194.87170: variable 'omit' from source: magic vars 11728 1726882194.87214: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882194.87255: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882194.87272: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882194.87289: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882194.87304: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882194.87500: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882194.87503: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882194.87506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882194.87509: Set connection var ansible_connection to ssh 11728 1726882194.87511: Set connection var ansible_shell_executable to /bin/sh 11728 1726882194.87513: Set connection var ansible_timeout to 10 11728 1726882194.87515: Set connection var ansible_shell_type to sh 11728 1726882194.87517: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882194.87520: Set connection var ansible_pipelining to False 11728 1726882194.87522: variable 'ansible_shell_executable' from source: unknown 11728 1726882194.87524: variable 'ansible_connection' from source: unknown 11728 1726882194.87525: variable 'ansible_module_compression' from source: unknown 11728 1726882194.87529: variable 'ansible_shell_type' from source: unknown 11728 1726882194.87531: variable 'ansible_shell_executable' from source: unknown 11728 1726882194.87533: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882194.87534: variable 'ansible_pipelining' from source: unknown 11728 1726882194.87537: variable 'ansible_timeout' from source: unknown 11728 1726882194.87539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882194.87767: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882194.87771: variable 'omit' from source: magic vars 11728 1726882194.87773: starting attempt loop 11728 1726882194.87775: running the handler 11728 1726882194.87828: variable 'lsr_net_profile_ansible_managed' from source: set_fact 11728 1726882194.87831: Evaluated conditional (lsr_net_profile_ansible_managed): True 11728 1726882194.87838: handler run complete 11728 1726882194.87853: attempt loop complete, returning result 11728 1726882194.87857: _execute() done 11728 1726882194.87867: dumping result to json 11728 1726882194.87870: done dumping result, returning 11728 1726882194.87876: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0' [12673a56-9f93-5c28-a762-0000000004db] 11728 1726882194.87882: sending task result for task 12673a56-9f93-5c28-a762-0000000004db 11728 1726882194.88067: done sending task result for task 12673a56-9f93-5c28-a762-0000000004db 11728 1726882194.88076: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11728 1726882194.88151: no more pending results, returning what we have 11728 1726882194.88155: results queue empty 11728 1726882194.88156: checking for any_errors_fatal 11728 1726882194.88163: done checking for any_errors_fatal 11728 1726882194.88164: checking for max_fail_percentage 11728 1726882194.88166: done checking for max_fail_percentage 11728 1726882194.88167: checking to see if all hosts have failed and the running result is not ok 11728 1726882194.88168: done checking to see if all hosts have failed 11728 1726882194.88169: getting the remaining hosts for this loop 11728 1726882194.88170: done getting the remaining hosts for this loop 11728 1726882194.88174: getting the next task for host managed_node3 11728 1726882194.88181: done getting next task for host managed_node3 11728 1726882194.88191: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 11728 1726882194.88198: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882194.88202: getting variables 11728 1726882194.88204: in VariableManager get_vars() 11728 1726882194.88238: Calling all_inventory to load vars for managed_node3 11728 1726882194.88242: Calling groups_inventory to load vars for managed_node3 11728 1726882194.88246: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882194.88256: Calling all_plugins_play to load vars for managed_node3 11728 1726882194.88260: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882194.88263: Calling groups_plugins_play to load vars for managed_node3 11728 1726882194.89950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882194.91664: done with get_vars() 11728 1726882194.91687: done getting variables 11728 1726882194.91746: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11728 1726882194.91876: variable 'profile' from source: include params 11728 1726882194.91879: variable 'bond_port_profile' from source: include params 11728 1726882194.91989: variable 'bond_port_profile' from source: include params TASK [Assert that the fingerprint comment is present in bond0] ***************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:29:54 -0400 (0:00:00.065) 0:00:19.772 ****** 11728 1726882194.92023: entering _queue_task() for managed_node3/assert 11728 1726882194.92686: worker is 1 (out of 1 available) 11728 1726882194.92699: exiting _queue_task() for managed_node3/assert 11728 1726882194.92713: done queuing things up, now waiting for results queue to drain 11728 1726882194.92714: waiting for pending results... 
11728 1726882194.93741: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0 11728 1726882194.94001: in run() - task 12673a56-9f93-5c28-a762-0000000004dc 11728 1726882194.94006: variable 'ansible_search_path' from source: unknown 11728 1726882194.94009: variable 'ansible_search_path' from source: unknown 11728 1726882194.94012: calling self._execute() 11728 1726882194.94212: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882194.94285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882194.94304: variable 'omit' from source: magic vars 11728 1726882194.95254: variable 'ansible_distribution_major_version' from source: facts 11728 1726882194.95258: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882194.95262: variable 'omit' from source: magic vars 11728 1726882194.95264: variable 'omit' from source: magic vars 11728 1726882194.95490: variable 'profile' from source: include params 11728 1726882194.95507: variable 'bond_port_profile' from source: include params 11728 1726882194.95571: variable 'bond_port_profile' from source: include params 11728 1726882194.95725: variable 'omit' from source: magic vars 11728 1726882194.95769: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882194.95831: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882194.95947: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882194.95969: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882194.95985: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882194.96024: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882194.96106: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882194.96114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882194.96306: Set connection var ansible_connection to ssh 11728 1726882194.96360: Set connection var ansible_shell_executable to /bin/sh 11728 1726882194.96378: Set connection var ansible_timeout to 10 11728 1726882194.96386: Set connection var ansible_shell_type to sh 11728 1726882194.96402: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882194.96417: Set connection var ansible_pipelining to False 11728 1726882194.96468: variable 'ansible_shell_executable' from source: unknown 11728 1726882194.96483: variable 'ansible_connection' from source: unknown 11728 1726882194.96490: variable 'ansible_module_compression' from source: unknown 11728 1726882194.96501: variable 'ansible_shell_type' from source: unknown 11728 1726882194.96508: variable 'ansible_shell_executable' from source: unknown 11728 1726882194.96517: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882194.96525: variable 'ansible_pipelining' from source: unknown 11728 1726882194.96531: variable 'ansible_timeout' from source: unknown 11728 1726882194.96538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882194.96680: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882194.96705: variable 'omit' from source: magic vars 11728 1726882194.96714: starting attempt loop 11728 1726882194.96802: running the handler 11728 1726882194.96842: variable 'lsr_net_profile_fingerprint' from source: set_fact 11728 1726882194.96852: Evaluated conditional (lsr_net_profile_fingerprint): True 11728 1726882194.96861: handler run complete 11728 1726882194.96878: attempt loop complete, returning result 11728 1726882194.96884: _execute() done 11728 1726882194.96889: dumping result to json 11728 1726882194.96901: done dumping result, returning 11728 1726882194.96917: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0 [12673a56-9f93-5c28-a762-0000000004dc] 11728 1726882194.96926: sending task result for task 12673a56-9f93-5c28-a762-0000000004dc ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11728 1726882194.97180: no more pending results, returning what we have 11728 1726882194.97184: results queue empty 11728 1726882194.97185: checking for any_errors_fatal 11728 1726882194.97195: done checking for any_errors_fatal 11728 1726882194.97196: checking for max_fail_percentage 11728 1726882194.97198: done checking for max_fail_percentage 11728 1726882194.97199: checking to see if all hosts have failed and the running result is not ok 11728 1726882194.97200: done checking to see if all hosts have failed 11728 1726882194.97200: getting the remaining hosts for this loop 11728 1726882194.97202: done getting the remaining hosts for this loop 11728 1726882194.97206: getting the next task for host managed_node3 11728 1726882194.97216: done getting next task for host managed_node3 11728 1726882194.97221: ^ task is: TASK: Include the task 'get_profile_stat.yml' 11728 1726882194.97225: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882194.97230: getting variables 11728 1726882194.97232: in VariableManager get_vars() 11728 1726882194.97262: Calling all_inventory to load vars for managed_node3 11728 1726882194.97265: Calling groups_inventory to load vars for managed_node3 11728 1726882194.97267: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882194.97277: Calling all_plugins_play to load vars for managed_node3 11728 1726882194.97280: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882194.97283: Calling groups_plugins_play to load vars for managed_node3 11728 1726882194.97799: done sending task result for task 12673a56-9f93-5c28-a762-0000000004dc 11728 1726882194.97805: WORKER PROCESS EXITING 11728 1726882195.13025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882195.16613: done with get_vars() 11728 1726882195.16670: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:29:55 -0400 (0:00:00.249) 0:00:20.022 ****** 11728 1726882195.16969: entering _queue_task() for managed_node3/include_tasks 11728 1726882195.17858: worker is 1 (out of 1 available) 11728 1726882195.17870: exiting _queue_task() for managed_node3/include_tasks 11728 1726882195.17881: done queuing things up, now waiting for results queue to drain 11728 1726882195.17882: waiting for pending results... 11728 1726882195.18526: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 11728 1726882195.18786: in run() - task 12673a56-9f93-5c28-a762-0000000004e0 11728 1726882195.18835: variable 'ansible_search_path' from source: unknown 11728 1726882195.18839: variable 'ansible_search_path' from source: unknown 11728 1726882195.18922: calling self._execute() 11728 1726882195.19179: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882195.19188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882195.19200: variable 'omit' from source: magic vars 11728 1726882195.20638: variable 'ansible_distribution_major_version' from source: facts 11728 1726882195.20643: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882195.20646: _execute() done 11728 1726882195.20649: dumping result to json 11728 1726882195.20651: done dumping result, returning 11728 1726882195.20653: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [12673a56-9f93-5c28-a762-0000000004e0] 11728 1726882195.20720: sending task result for task 12673a56-9f93-5c28-a762-0000000004e0 11728 1726882195.20806: done sending task result for task 12673a56-9f93-5c28-a762-0000000004e0 11728 1726882195.20810: WORKER PROCESS EXITING 11728 1726882195.20842: no more pending results, returning what we have 11728 1726882195.20848: in VariableManager get_vars() 11728 1726882195.20890: Calling all_inventory to load vars for managed_node3 11728 1726882195.20896: Calling groups_inventory to load vars for managed_node3 11728 1726882195.20901: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882195.20920: Calling all_plugins_play to load vars for managed_node3 11728 1726882195.20924: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882195.20927: Calling groups_plugins_play 
to load vars for managed_node3 11728 1726882195.24510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882195.28313: done with get_vars() 11728 1726882195.28342: variable 'ansible_search_path' from source: unknown 11728 1726882195.28344: variable 'ansible_search_path' from source: unknown 11728 1726882195.28387: we have included files to process 11728 1726882195.28389: generating all_blocks data 11728 1726882195.28391: done generating all_blocks data 11728 1726882195.28570: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11728 1726882195.28571: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11728 1726882195.28635: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11728 1726882195.30330: done processing included file 11728 1726882195.30333: iterating over new_blocks loaded from include file 11728 1726882195.30334: in VariableManager get_vars() 11728 1726882195.30353: done with get_vars() 11728 1726882195.30355: filtering new block on tags 11728 1726882195.30467: done filtering new block on tags 11728 1726882195.30471: in VariableManager get_vars() 11728 1726882195.30486: done with get_vars() 11728 1726882195.30488: filtering new block on tags 11728 1726882195.30560: done filtering new block on tags 11728 1726882195.30562: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 11728 1726882195.30568: extending task lists for all hosts with included blocks 11728 1726882195.31222: done extending task lists 11728 1726882195.31224: done processing included files 11728 1726882195.31225: results queue empty 11728 1726882195.31226: checking for any_errors_fatal 11728 1726882195.31230: done checking for any_errors_fatal 11728 1726882195.31231: checking for max_fail_percentage 11728 1726882195.31232: done checking for max_fail_percentage 11728 1726882195.31233: checking to see if all hosts have failed and the running result is not ok 11728 1726882195.31234: done checking to see if all hosts have failed 11728 1726882195.31234: getting the remaining hosts for this loop 11728 1726882195.31236: done getting the remaining hosts for this loop 11728 1726882195.31239: getting the next task for host managed_node3 11728 1726882195.31243: done getting next task for host managed_node3 11728 1726882195.31246: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 11728 1726882195.31249: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882195.31252: getting variables 11728 1726882195.31253: in VariableManager get_vars() 11728 1726882195.31262: Calling all_inventory to load vars for managed_node3 11728 1726882195.31264: Calling groups_inventory to load vars for managed_node3 11728 1726882195.31267: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882195.31273: Calling all_plugins_play to load vars for managed_node3 11728 1726882195.31275: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882195.31278: Calling groups_plugins_play to load vars for managed_node3 11728 1726882195.33541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882195.35926: done with get_vars() 11728 1726882195.35955: done getting variables 11728 1726882195.36022: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:29:55 -0400 (0:00:00.190) 0:00:20.212 ****** 11728 1726882195.36058: entering _queue_task() for managed_node3/set_fact 11728 1726882195.36465: worker is 1 (out of 1 available) 11728 1726882195.36476: exiting _queue_task() for managed_node3/set_fact 11728 1726882195.36491: done queuing things up, now waiting for results queue to drain 11728 1726882195.36492: waiting for pending results... 
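The first task of the newly included get_profile_stat.yml resets the three flags. Its body can be read almost directly off the set_fact result recorded below, so this sketch should be close to exact (only the layout is assumed):

- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false

The "Stat profile file" task queued next presumably registers the profile_stat result that gates the get/verify pairs seen earlier; the low-level /bin/sh -c 'echo ~ && sleep 0' command at the end of this excerpt is the ssh connection plugin discovering the remote home directory before it uploads the stat module.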
11728 1726882195.37202: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 11728 1726882195.37208: in run() - task 12673a56-9f93-5c28-a762-000000000558 11728 1726882195.37210: variable 'ansible_search_path' from source: unknown 11728 1726882195.37213: variable 'ansible_search_path' from source: unknown 11728 1726882195.37216: calling self._execute() 11728 1726882195.37218: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882195.37222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882195.37225: variable 'omit' from source: magic vars 11728 1726882195.37467: variable 'ansible_distribution_major_version' from source: facts 11728 1726882195.37479: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882195.37485: variable 'omit' from source: magic vars 11728 1726882195.37555: variable 'omit' from source: magic vars 11728 1726882195.37589: variable 'omit' from source: magic vars 11728 1726882195.37641: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882195.37677: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882195.37703: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882195.37722: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882195.37744: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882195.37772: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882195.37776: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882195.37779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882195.37888: Set connection var ansible_connection to ssh 11728 1726882195.37902: Set connection var ansible_shell_executable to /bin/sh 11728 1726882195.37998: Set connection var ansible_timeout to 10 11728 1726882195.38001: Set connection var ansible_shell_type to sh 11728 1726882195.38004: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882195.38006: Set connection var ansible_pipelining to False 11728 1726882195.38009: variable 'ansible_shell_executable' from source: unknown 11728 1726882195.38011: variable 'ansible_connection' from source: unknown 11728 1726882195.38014: variable 'ansible_module_compression' from source: unknown 11728 1726882195.38016: variable 'ansible_shell_type' from source: unknown 11728 1726882195.38018: variable 'ansible_shell_executable' from source: unknown 11728 1726882195.38020: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882195.38022: variable 'ansible_pipelining' from source: unknown 11728 1726882195.38024: variable 'ansible_timeout' from source: unknown 11728 1726882195.38026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882195.38287: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882195.38290: variable 
'omit' from source: magic vars 11728 1726882195.38292: starting attempt loop 11728 1726882195.38296: running the handler 11728 1726882195.38297: handler run complete 11728 1726882195.38299: attempt loop complete, returning result 11728 1726882195.38301: _execute() done 11728 1726882195.38302: dumping result to json 11728 1726882195.38304: done dumping result, returning 11728 1726882195.38306: done running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [12673a56-9f93-5c28-a762-000000000558] 11728 1726882195.38307: sending task result for task 12673a56-9f93-5c28-a762-000000000558 11728 1726882195.38362: done sending task result for task 12673a56-9f93-5c28-a762-000000000558 11728 1726882195.38365: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 11728 1726882195.38439: no more pending results, returning what we have 11728 1726882195.38444: results queue empty 11728 1726882195.38446: checking for any_errors_fatal 11728 1726882195.38447: done checking for any_errors_fatal 11728 1726882195.38448: checking for max_fail_percentage 11728 1726882195.38450: done checking for max_fail_percentage 11728 1726882195.38451: checking to see if all hosts have failed and the running result is not ok 11728 1726882195.38452: done checking to see if all hosts have failed 11728 1726882195.38452: getting the remaining hosts for this loop 11728 1726882195.38454: done getting the remaining hosts for this loop 11728 1726882195.38458: getting the next task for host managed_node3 11728 1726882195.38467: done getting next task for host managed_node3 11728 1726882195.38470: ^ task is: TASK: Stat profile file 11728 1726882195.38476: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882195.38481: getting variables 11728 1726882195.38482: in VariableManager get_vars() 11728 1726882195.38521: Calling all_inventory to load vars for managed_node3 11728 1726882195.38524: Calling groups_inventory to load vars for managed_node3 11728 1726882195.38527: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882195.38538: Calling all_plugins_play to load vars for managed_node3 11728 1726882195.38541: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882195.38545: Calling groups_plugins_play to load vars for managed_node3 11728 1726882195.40623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882195.42289: done with get_vars() 11728 1726882195.42315: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:29:55 -0400 (0:00:00.063) 0:00:20.276 ****** 11728 1726882195.42424: entering _queue_task() for managed_node3/stat 11728 1726882195.42764: worker is 1 (out of 1 available) 11728 1726882195.42777: exiting _queue_task() for managed_node3/stat 11728 1726882195.42790: done queuing things up, now waiting for results queue to drain 11728 1726882195.42791: waiting for pending results... 11728 1726882195.43167: running TaskExecutor() for managed_node3/TASK: Stat profile file 11728 1726882195.43173: in run() - task 12673a56-9f93-5c28-a762-000000000559 11728 1726882195.43186: variable 'ansible_search_path' from source: unknown 11728 1726882195.43190: variable 'ansible_search_path' from source: unknown 11728 1726882195.43251: calling self._execute() 11728 1726882195.43368: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882195.43372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882195.43378: variable 'omit' from source: magic vars 11728 1726882195.44327: variable 'ansible_distribution_major_version' from source: facts 11728 1726882195.44341: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882195.44347: variable 'omit' from source: magic vars 11728 1726882195.44581: variable 'omit' from source: magic vars 11728 1726882195.44701: variable 'profile' from source: include params 11728 1726882195.44707: variable 'bond_port_profile' from source: include params 11728 1726882195.44907: variable 'bond_port_profile' from source: include params 11728 1726882195.44998: variable 'omit' from source: magic vars 11728 1726882195.45047: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882195.45160: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882195.45181: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882195.45240: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882195.45340: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882195.45347: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882195.45349: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 
1726882195.45352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882195.45406: Set connection var ansible_connection to ssh 11728 1726882195.45422: Set connection var ansible_shell_executable to /bin/sh 11728 1726882195.45427: Set connection var ansible_timeout to 10 11728 1726882195.45430: Set connection var ansible_shell_type to sh 11728 1726882195.45445: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882195.45451: Set connection var ansible_pipelining to False 11728 1726882195.45471: variable 'ansible_shell_executable' from source: unknown 11728 1726882195.45474: variable 'ansible_connection' from source: unknown 11728 1726882195.45476: variable 'ansible_module_compression' from source: unknown 11728 1726882195.45479: variable 'ansible_shell_type' from source: unknown 11728 1726882195.45481: variable 'ansible_shell_executable' from source: unknown 11728 1726882195.45483: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882195.45487: variable 'ansible_pipelining' from source: unknown 11728 1726882195.45490: variable 'ansible_timeout' from source: unknown 11728 1726882195.45492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882195.45717: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11728 1726882195.45776: variable 'omit' from source: magic vars 11728 1726882195.45779: starting attempt loop 11728 1726882195.45782: running the handler 11728 1726882195.45784: _low_level_execute_command(): starting 11728 1726882195.45786: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882195.46485: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882195.46649: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882195.46730: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882195.46733: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882195.48353: stdout chunk (state=3): >>>/root <<< 11728 1726882195.48491: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882195.48496: stdout chunk (state=3): >>><<< 11728 1726882195.48798: stderr chunk (state=3): >>><<< 11728 1726882195.48803: _low_level_execute_command() done: rc=0, 
stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882195.48807: _low_level_execute_command(): starting 11728 1726882195.48810: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882195.4862244-12719-121208659205507 `" && echo ansible-tmp-1726882195.4862244-12719-121208659205507="` echo /root/.ansible/tmp/ansible-tmp-1726882195.4862244-12719-121208659205507 `" ) && sleep 0' 11728 1726882195.49491: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882195.49506: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882195.49517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882195.49531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882195.49543: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882195.49555: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882195.49559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882195.49579: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882195.49720: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882195.49725: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882195.49764: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882195.51615: stdout chunk (state=3): >>>ansible-tmp-1726882195.4862244-12719-121208659205507=/root/.ansible/tmp/ansible-tmp-1726882195.4862244-12719-121208659205507 <<< 11728 
1726882195.51716: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882195.51829: stderr chunk (state=3): >>><<< 11728 1726882195.51864: stdout chunk (state=3): >>><<< 11728 1726882195.51886: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882195.4862244-12719-121208659205507=/root/.ansible/tmp/ansible-tmp-1726882195.4862244-12719-121208659205507 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882195.51934: variable 'ansible_module_compression' from source: unknown 11728 1726882195.51999: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11728 1726882195.52099: variable 'ansible_facts' from source: unknown 11728 1726882195.52522: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882195.4862244-12719-121208659205507/AnsiballZ_stat.py 11728 1726882195.52690: Sending initial data 11728 1726882195.52767: Sent initial data (153 bytes) 11728 1726882195.54017: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882195.54065: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882195.54078: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882195.54140: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882195.55628: stderr chunk (state=3): >>>debug2: Remote version: 3 
debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882195.55683: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882195.55752: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmp9rk3w7vq /root/.ansible/tmp/ansible-tmp-1726882195.4862244-12719-121208659205507/AnsiballZ_stat.py <<< 11728 1726882195.55756: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882195.4862244-12719-121208659205507/AnsiballZ_stat.py" <<< 11728 1726882195.56400: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmp9rk3w7vq" to remote "/root/.ansible/tmp/ansible-tmp-1726882195.4862244-12719-121208659205507/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882195.4862244-12719-121208659205507/AnsiballZ_stat.py" <<< 11728 1726882195.56958: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882195.56961: stdout chunk (state=3): >>><<< 11728 1726882195.56977: stderr chunk (state=3): >>><<< 11728 1726882195.57114: done transferring module to remote 11728 1726882195.57125: _low_level_execute_command(): starting 11728 1726882195.57130: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882195.4862244-12719-121208659205507/ /root/.ansible/tmp/ansible-tmp-1726882195.4862244-12719-121208659205507/AnsiballZ_stat.py && sleep 0' 11728 1726882195.57753: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882195.57768: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882195.57783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882195.57804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882195.57821: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882195.57910: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 
11728 1726882195.57930: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882195.57953: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882195.58029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882195.59946: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882195.59951: stdout chunk (state=3): >>><<< 11728 1726882195.59954: stderr chunk (state=3): >>><<< 11728 1726882195.59970: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882195.59974: _low_level_execute_command(): starting 11728 1726882195.59978: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882195.4862244-12719-121208659205507/AnsiballZ_stat.py && sleep 0' 11728 1726882195.60982: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882195.61005: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882195.61019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882195.61036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882195.61054: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882195.61100: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882195.61114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882195.61189: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882195.61231: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 
1726882195.61253: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882195.61330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882195.76306: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11728 1726882195.77820: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882195.77824: stdout chunk (state=3): >>><<< 11728 1726882195.77826: stderr chunk (state=3): >>><<< 11728 1726882195.77829: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
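The stat invocation above belongs to the "Stat profile file" task at get_profile_stat.yml:9. From the module_args visible in the result (path /etc/sysconfig/network-scripts/ifcfg-bond0.0 with get_attributes, get_checksum and get_mime all false) and the profile/bond_port_profile include params resolved earlier, the task is likely shaped roughly like the sketch below; the exact task text is not part of this log, and the path templating and register name are inferred, not quoted.

# Approximate reconstruction from the module_args logged above; the real task
# in get_profile_stat.yml may template the path differently. The register name
# is inferred from the profile_stat.stat.exists condition evaluated later.
- name: Stat profile file
  stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat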
11728 1726882195.77865: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882195.4862244-12719-121208659205507/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882195.77881: _low_level_execute_command(): starting 11728 1726882195.77890: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882195.4862244-12719-121208659205507/ > /dev/null 2>&1 && sleep 0' 11728 1726882195.78528: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882195.78578: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882195.78583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882195.78614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882195.78690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882195.78712: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882195.78759: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882195.78801: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882195.80631: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882195.80635: stdout chunk (state=3): >>><<< 11728 1726882195.80638: stderr chunk (state=3): >>><<< 11728 1726882195.80799: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882195.80803: handler run complete 11728 1726882195.80806: attempt loop complete, returning result 11728 1726882195.80808: _execute() done 11728 1726882195.80810: dumping result to json 11728 1726882195.80812: done dumping result, returning 11728 1726882195.80814: done running TaskExecutor() for managed_node3/TASK: Stat profile file [12673a56-9f93-5c28-a762-000000000559] 11728 1726882195.80817: sending task result for task 12673a56-9f93-5c28-a762-000000000559 11728 1726882195.80892: done sending task result for task 12673a56-9f93-5c28-a762-000000000559 11728 1726882195.80898: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 11728 1726882195.80959: no more pending results, returning what we have 11728 1726882195.80963: results queue empty 11728 1726882195.80964: checking for any_errors_fatal 11728 1726882195.80970: done checking for any_errors_fatal 11728 1726882195.80971: checking for max_fail_percentage 11728 1726882195.80972: done checking for max_fail_percentage 11728 1726882195.80973: checking to see if all hosts have failed and the running result is not ok 11728 1726882195.80973: done checking to see if all hosts have failed 11728 1726882195.80974: getting the remaining hosts for this loop 11728 1726882195.80976: done getting the remaining hosts for this loop 11728 1726882195.80979: getting the next task for host managed_node3 11728 1726882195.80987: done getting next task for host managed_node3 11728 1726882195.80989: ^ task is: TASK: Set NM profile exist flag based on the profile files 11728 1726882195.81000: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11728 1726882195.81005: getting variables 11728 1726882195.81006: in VariableManager get_vars() 11728 1726882195.81038: Calling all_inventory to load vars for managed_node3 11728 1726882195.81041: Calling groups_inventory to load vars for managed_node3 11728 1726882195.81044: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882195.81054: Calling all_plugins_play to load vars for managed_node3 11728 1726882195.81057: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882195.81059: Calling groups_plugins_play to load vars for managed_node3 11728 1726882195.82510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882195.84034: done with get_vars() 11728 1726882195.84059: done getting variables 11728 1726882195.84125: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:29:55 -0400 (0:00:00.417) 0:00:20.694 ****** 11728 1726882195.84163: entering _queue_task() for managed_node3/set_fact 11728 1726882195.84489: worker is 1 (out of 1 available) 11728 1726882195.84606: exiting _queue_task() for managed_node3/set_fact 11728 1726882195.84618: done queuing things up, now waiting for results queue to drain 11728 1726882195.84619: waiting for pending results... 
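Earlier in this block the "Initialize NM profile exist and ansible_managed comment flag" task returned the three lsr_net_profile_* facts, all false, and the stat result above was registered; the task now waiting ("Set NM profile exist flag based on the profile files") re-evaluates the exists flag against that registered result. A minimal sketch of the initialization step, assuming a plain set_fact using the fact names reported in the ok: result:

# Sketch only: fact names and values taken from the ok: result logged above;
# the actual task text in the collection may differ.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false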
11728 1726882195.84859: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 11728 1726882195.85063: in run() - task 12673a56-9f93-5c28-a762-00000000055a 11728 1726882195.85068: variable 'ansible_search_path' from source: unknown 11728 1726882195.85072: variable 'ansible_search_path' from source: unknown 11728 1726882195.85075: calling self._execute() 11728 1726882195.85139: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882195.85152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882195.85168: variable 'omit' from source: magic vars 11728 1726882195.85549: variable 'ansible_distribution_major_version' from source: facts 11728 1726882195.85567: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882195.85695: variable 'profile_stat' from source: set_fact 11728 1726882195.85714: Evaluated conditional (profile_stat.stat.exists): False 11728 1726882195.85723: when evaluation is False, skipping this task 11728 1726882195.85736: _execute() done 11728 1726882195.85745: dumping result to json 11728 1726882195.85802: done dumping result, returning 11728 1726882195.85805: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [12673a56-9f93-5c28-a762-00000000055a] 11728 1726882195.85808: sending task result for task 12673a56-9f93-5c28-a762-00000000055a 11728 1726882195.85876: done sending task result for task 12673a56-9f93-5c28-a762-00000000055a 11728 1726882195.85879: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11728 1726882195.85931: no more pending results, returning what we have 11728 1726882195.85935: results queue empty 11728 1726882195.85937: checking for any_errors_fatal 11728 1726882195.85945: done checking for any_errors_fatal 11728 1726882195.85946: checking for max_fail_percentage 11728 1726882195.85948: done checking for max_fail_percentage 11728 1726882195.85949: checking to see if all hosts have failed and the running result is not ok 11728 1726882195.85950: done checking to see if all hosts have failed 11728 1726882195.85950: getting the remaining hosts for this loop 11728 1726882195.85952: done getting the remaining hosts for this loop 11728 1726882195.85956: getting the next task for host managed_node3 11728 1726882195.85964: done getting next task for host managed_node3 11728 1726882195.85967: ^ task is: TASK: Get NM profile info 11728 1726882195.85974: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882195.85980: getting variables 11728 1726882195.85981: in VariableManager get_vars() 11728 1726882195.86214: Calling all_inventory to load vars for managed_node3 11728 1726882195.86217: Calling groups_inventory to load vars for managed_node3 11728 1726882195.86220: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882195.86230: Calling all_plugins_play to load vars for managed_node3 11728 1726882195.86232: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882195.86235: Calling groups_plugins_play to load vars for managed_node3 11728 1726882195.87677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882195.89232: done with get_vars() 11728 1726882195.89254: done getting variables 11728 1726882195.89330: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:29:55 -0400 (0:00:00.052) 0:00:20.746 ****** 11728 1726882195.89369: entering _queue_task() for managed_node3/shell 11728 1726882195.89672: worker is 1 (out of 1 available) 11728 1726882195.89687: exiting _queue_task() for managed_node3/shell 11728 1726882195.89703: done queuing things up, now waiting for results queue to drain 11728 1726882195.89705: waiting for pending results... 
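The skip above follows directly from the registered stat result: profile_stat.stat.exists is false, so the when: condition fails and lsr_net_profile_exists keeps its initialized value. Assuming the task simply flips the flag when an initscripts profile file is found, it would look roughly like this; the conditional string is quoted from the skip result, the fact value is an assumption.

# Hedged sketch; "profile_stat.stat.exists" is taken from the false_condition
# in the skip result above, the assigned value true is assumed.
- name: Set NM profile exist flag based on the profile files
  set_fact:
    lsr_net_profile_exists: true
  when: profile_stat.stat.exists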
11728 1726882195.89873: running TaskExecutor() for managed_node3/TASK: Get NM profile info 11728 1726882195.89965: in run() - task 12673a56-9f93-5c28-a762-00000000055b 11728 1726882195.89978: variable 'ansible_search_path' from source: unknown 11728 1726882195.89981: variable 'ansible_search_path' from source: unknown 11728 1726882195.90013: calling self._execute() 11728 1726882195.90084: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882195.90088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882195.90100: variable 'omit' from source: magic vars 11728 1726882195.90369: variable 'ansible_distribution_major_version' from source: facts 11728 1726882195.90379: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882195.90385: variable 'omit' from source: magic vars 11728 1726882195.90427: variable 'omit' from source: magic vars 11728 1726882195.90499: variable 'profile' from source: include params 11728 1726882195.90503: variable 'bond_port_profile' from source: include params 11728 1726882195.90547: variable 'bond_port_profile' from source: include params 11728 1726882195.90562: variable 'omit' from source: magic vars 11728 1726882195.90599: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882195.90626: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882195.90641: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882195.90654: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882195.90664: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882195.90688: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882195.90692: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882195.90698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882195.90761: Set connection var ansible_connection to ssh 11728 1726882195.90769: Set connection var ansible_shell_executable to /bin/sh 11728 1726882195.90775: Set connection var ansible_timeout to 10 11728 1726882195.90777: Set connection var ansible_shell_type to sh 11728 1726882195.90783: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882195.90788: Set connection var ansible_pipelining to False 11728 1726882195.90810: variable 'ansible_shell_executable' from source: unknown 11728 1726882195.90813: variable 'ansible_connection' from source: unknown 11728 1726882195.90816: variable 'ansible_module_compression' from source: unknown 11728 1726882195.90818: variable 'ansible_shell_type' from source: unknown 11728 1726882195.90821: variable 'ansible_shell_executable' from source: unknown 11728 1726882195.90823: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882195.90825: variable 'ansible_pipelining' from source: unknown 11728 1726882195.90828: variable 'ansible_timeout' from source: unknown 11728 1726882195.90841: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882195.90931: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882195.90948: variable 'omit' from source: magic vars 11728 1726882195.90951: starting attempt loop 11728 1726882195.90953: running the handler 11728 1726882195.90956: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882195.90976: _low_level_execute_command(): starting 11728 1726882195.90983: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882195.91476: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882195.91522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882195.91525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11728 1726882195.91528: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882195.91530: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882195.91572: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882195.91575: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882195.91577: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882195.91632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882195.93176: stdout chunk (state=3): >>>/root <<< 11728 1726882195.93279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882195.93307: stderr chunk (state=3): >>><<< 11728 1726882195.93310: stdout chunk (state=3): >>><<< 11728 1726882195.93331: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882195.93342: _low_level_execute_command(): starting 11728 1726882195.93348: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882195.9333062-12747-224983577300992 `" && echo ansible-tmp-1726882195.9333062-12747-224983577300992="` echo /root/.ansible/tmp/ansible-tmp-1726882195.9333062-12747-224983577300992 `" ) && sleep 0' 11728 1726882195.93790: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882195.93798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882195.93801: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882195.93803: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882195.93805: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882195.93845: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882195.93857: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882195.93914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882195.95755: stdout chunk (state=3): >>>ansible-tmp-1726882195.9333062-12747-224983577300992=/root/.ansible/tmp/ansible-tmp-1726882195.9333062-12747-224983577300992 <<< 11728 1726882195.96005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882195.96009: stdout chunk (state=3): >>><<< 11728 1726882195.96012: stderr chunk (state=3): >>><<< 11728 1726882195.96014: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882195.9333062-12747-224983577300992=/root/.ansible/tmp/ansible-tmp-1726882195.9333062-12747-224983577300992 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882195.96017: variable 'ansible_module_compression' from source: unknown 11728 1726882195.96028: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882195.96071: variable 'ansible_facts' from source: unknown 11728 1726882195.96191: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882195.9333062-12747-224983577300992/AnsiballZ_command.py 11728 1726882195.96352: Sending initial data 11728 1726882195.96447: Sent initial data (156 bytes) 11728 1726882195.96938: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882195.96967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882195.96970: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882195.96972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882195.96975: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882195.96977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882195.97041: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882195.97046: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882195.97085: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882195.98567: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 11728 1726882195.98571: stderr chunk (state=3): 
>>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882195.98614: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882195.98656: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpq0aoxe6k /root/.ansible/tmp/ansible-tmp-1726882195.9333062-12747-224983577300992/AnsiballZ_command.py <<< 11728 1726882195.98663: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882195.9333062-12747-224983577300992/AnsiballZ_command.py" <<< 11728 1726882195.98700: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpq0aoxe6k" to remote "/root/.ansible/tmp/ansible-tmp-1726882195.9333062-12747-224983577300992/AnsiballZ_command.py" <<< 11728 1726882195.98708: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882195.9333062-12747-224983577300992/AnsiballZ_command.py" <<< 11728 1726882195.99299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882195.99664: stderr chunk (state=3): >>><<< 11728 1726882195.99670: stdout chunk (state=3): >>><<< 11728 1726882195.99673: done transferring module to remote 11728 1726882195.99675: _low_level_execute_command(): starting 11728 1726882195.99677: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882195.9333062-12747-224983577300992/ /root/.ansible/tmp/ansible-tmp-1726882195.9333062-12747-224983577300992/AnsiballZ_command.py && sleep 0' 11728 1726882196.00418: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882196.00611: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882196.00632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882196.00645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882196.00713: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882196.00799: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882196.00846: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882196.02524: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882196.02550: stderr chunk (state=3): >>><<< 11728 1726882196.02553: stdout chunk (state=3): >>><<< 11728 1726882196.02566: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882196.02569: _low_level_execute_command(): starting 11728 1726882196.02574: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882195.9333062-12747-224983577300992/AnsiballZ_command.py && sleep 0' 11728 1726882196.03032: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882196.03035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882196.03038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882196.03103: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882196.03107: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882196.03166: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882196.20265: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-20 21:29:56.180525", "end": "2024-09-20 21:29:56.201010", "delta": "0:00:00.020485", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 
1726882196.21692: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882196.21753: stderr chunk (state=3): >>><<< 11728 1726882196.21757: stdout chunk (state=3): >>><<< 11728 1726882196.21815: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-20 21:29:56.180525", "end": "2024-09-20 21:29:56.201010", "delta": "0:00:00.020485", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
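[annotation] The JSON blob above is the module result for the TASK "Get NM profile info" (get_profile_stat.yml). The task source itself is not part of this log; a minimal sketch consistent with the module_args shown (a shell pipeline around nmcli) and with the nm_profile_exists.rc conditional evaluated further down, with {{ profile }} standing in for the literal bond0.0 and the error handling being an assumption:

    - name: Get NM profile info
      shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
      register: nm_profile_exists   # nm_profile_exists.rc == 0 feeds the set_fact task below
      ignore_errors: true           # assumption: a missing profile should not abort the test run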
11728 1726882196.21844: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882195.9333062-12747-224983577300992/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882196.21852: _low_level_execute_command(): starting 11728 1726882196.21854: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882195.9333062-12747-224983577300992/ > /dev/null 2>&1 && sleep 0' 11728 1726882196.22582: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882196.22588: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882196.22630: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882196.24405: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882196.24416: stderr chunk (state=3): >>><<< 11728 1726882196.24426: stdout chunk (state=3): >>><<< 11728 1726882196.24443: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882196.24449: handler run complete 11728 1726882196.24468: Evaluated conditional (False): False 11728 1726882196.24477: attempt loop complete, returning result 11728 1726882196.24483: _execute() done 11728 1726882196.24488: dumping result to json 11728 1726882196.24548: done dumping result, returning 11728 1726882196.24551: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [12673a56-9f93-5c28-a762-00000000055b] 11728 1726882196.24553: sending task result for task 12673a56-9f93-5c28-a762-00000000055b 11728 1726882196.24635: done sending task result for task 12673a56-9f93-5c28-a762-00000000055b 11728 1726882196.24639: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "delta": "0:00:00.020485", "end": "2024-09-20 21:29:56.201010", "rc": 0, "start": "2024-09-20 21:29:56.180525" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection 11728 1726882196.24740: no more pending results, returning what we have 11728 1726882196.24744: results queue empty 11728 1726882196.24746: checking for any_errors_fatal 11728 1726882196.24774: done checking for any_errors_fatal 11728 1726882196.24775: checking for max_fail_percentage 11728 1726882196.24777: done checking for max_fail_percentage 11728 1726882196.24778: checking to see if all hosts have failed and the running result is not ok 11728 1726882196.24778: done checking to see if all hosts have failed 11728 1726882196.24779: getting the remaining hosts for this loop 11728 1726882196.24781: done getting the remaining hosts for this loop 11728 1726882196.24785: getting the next task for host managed_node3 11728 1726882196.24792: done getting next task for host managed_node3 11728 1726882196.24798: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11728 1726882196.24804: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882196.24808: getting variables 11728 1726882196.24809: in VariableManager get_vars() 11728 1726882196.24842: Calling all_inventory to load vars for managed_node3 11728 1726882196.24847: Calling groups_inventory to load vars for managed_node3 11728 1726882196.24850: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882196.24907: Calling all_plugins_play to load vars for managed_node3 11728 1726882196.24913: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882196.24917: Calling groups_plugins_play to load vars for managed_node3 11728 1726882196.25910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882196.27787: done with get_vars() 11728 1726882196.27820: done getting variables 11728 1726882196.27863: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:29:56 -0400 (0:00:00.385) 0:00:21.131 ****** 11728 1726882196.27887: entering _queue_task() for managed_node3/set_fact 11728 1726882196.28143: worker is 1 (out of 1 available) 11728 1726882196.28156: exiting _queue_task() for managed_node3/set_fact 11728 1726882196.28171: done queuing things up, now waiting for results queue to drain 11728 1726882196.28172: waiting for pending results... 
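[annotation] The ssh stderr repeated around every _low_level_execute_command() call ("auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41'", "mux_client_request_session: master session id: 2") shows the connection plugin reusing a multiplexed ControlMaster socket, so the chmod, python and rm commands above each cost one mux session instead of a full SSH handshake. No ssh tuning is visible in this log; the settings below are roughly the ansible-core defaults, written out as an explicit ansible.cfg only for illustration:

    [ssh_connection]
    # approximately the defaults that produce the ControlMaster reuse seen in this log
    ssh_args = -C -o ControlMaster=auto -o ControlPersist=60s
    control_path_dir = ~/.ansible/cp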
11728 1726882196.28350: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11728 1726882196.28441: in run() - task 12673a56-9f93-5c28-a762-00000000055c 11728 1726882196.28454: variable 'ansible_search_path' from source: unknown 11728 1726882196.28457: variable 'ansible_search_path' from source: unknown 11728 1726882196.28486: calling self._execute() 11728 1726882196.28562: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882196.28566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882196.28576: variable 'omit' from source: magic vars 11728 1726882196.28877: variable 'ansible_distribution_major_version' from source: facts 11728 1726882196.28887: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882196.28978: variable 'nm_profile_exists' from source: set_fact 11728 1726882196.28989: Evaluated conditional (nm_profile_exists.rc == 0): True 11728 1726882196.28998: variable 'omit' from source: magic vars 11728 1726882196.29036: variable 'omit' from source: magic vars 11728 1726882196.29062: variable 'omit' from source: magic vars 11728 1726882196.29097: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882196.29123: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882196.29138: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882196.29152: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882196.29168: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882196.29190: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882196.29197: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882196.29200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882196.29310: Set connection var ansible_connection to ssh 11728 1726882196.29319: Set connection var ansible_shell_executable to /bin/sh 11728 1726882196.29324: Set connection var ansible_timeout to 10 11728 1726882196.29326: Set connection var ansible_shell_type to sh 11728 1726882196.29334: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882196.29349: Set connection var ansible_pipelining to False 11728 1726882196.29372: variable 'ansible_shell_executable' from source: unknown 11728 1726882196.29377: variable 'ansible_connection' from source: unknown 11728 1726882196.29380: variable 'ansible_module_compression' from source: unknown 11728 1726882196.29382: variable 'ansible_shell_type' from source: unknown 11728 1726882196.29384: variable 'ansible_shell_executable' from source: unknown 11728 1726882196.29386: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882196.29388: variable 'ansible_pipelining' from source: unknown 11728 1726882196.29392: variable 'ansible_timeout' from source: unknown 11728 1726882196.29413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882196.29546: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882196.29567: variable 'omit' from source: magic vars 11728 1726882196.29570: starting attempt loop 11728 1726882196.29573: running the handler 11728 1726882196.29608: handler run complete 11728 1726882196.29612: attempt loop complete, returning result 11728 1726882196.29638: _execute() done 11728 1726882196.29642: dumping result to json 11728 1726882196.29644: done dumping result, returning 11728 1726882196.29647: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12673a56-9f93-5c28-a762-00000000055c] 11728 1726882196.29649: sending task result for task 12673a56-9f93-5c28-a762-00000000055c 11728 1726882196.29749: done sending task result for task 12673a56-9f93-5c28-a762-00000000055c 11728 1726882196.29752: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 11728 1726882196.29817: no more pending results, returning what we have 11728 1726882196.29820: results queue empty 11728 1726882196.29821: checking for any_errors_fatal 11728 1726882196.29830: done checking for any_errors_fatal 11728 1726882196.29831: checking for max_fail_percentage 11728 1726882196.29835: done checking for max_fail_percentage 11728 1726882196.29835: checking to see if all hosts have failed and the running result is not ok 11728 1726882196.29836: done checking to see if all hosts have failed 11728 1726882196.29837: getting the remaining hosts for this loop 11728 1726882196.29838: done getting the remaining hosts for this loop 11728 1726882196.29842: getting the next task for host managed_node3 11728 1726882196.29852: done getting next task for host managed_node3 11728 1726882196.29854: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 11728 1726882196.29860: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882196.29863: getting variables 11728 1726882196.29864: in VariableManager get_vars() 11728 1726882196.29901: Calling all_inventory to load vars for managed_node3 11728 1726882196.29904: Calling groups_inventory to load vars for managed_node3 11728 1726882196.29907: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882196.29917: Calling all_plugins_play to load vars for managed_node3 11728 1726882196.29919: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882196.29922: Calling groups_plugins_play to load vars for managed_node3 11728 1726882196.31078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882196.32185: done with get_vars() 11728 1726882196.32211: done getting variables 11728 1726882196.32252: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11728 1726882196.32341: variable 'profile' from source: include params 11728 1726882196.32344: variable 'bond_port_profile' from source: include params 11728 1726882196.32382: variable 'bond_port_profile' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.0] ************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:29:56 -0400 (0:00:00.045) 0:00:21.176 ****** 11728 1726882196.32410: entering _queue_task() for managed_node3/command 11728 1726882196.32733: worker is 1 (out of 1 available) 11728 1726882196.32747: exiting _queue_task() for managed_node3/command 11728 1726882196.32761: done queuing things up, now waiting for results queue to drain 11728 1726882196.32763: waiting for pending results... 
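[annotation] The ansible_facts printed above (lsr_net_profile_exists, lsr_net_profile_ansible_managed, lsr_net_profile_fingerprint, all true) come from the set_fact task at get_profile_stat.yml:35. Its body is not included in the log; a sketch consistent with the two conditionals evaluated for it (ansible_distribution_major_version != '6', which is applied to every task in this run, and nm_profile_exists.rc == 0):

    - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
      set_fact:
        lsr_net_profile_exists: true
        lsr_net_profile_ansible_managed: true
        lsr_net_profile_fingerprint: true
      when: nm_profile_exists.rc == 0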
11728 1726882196.32964: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.0 11728 1726882196.33070: in run() - task 12673a56-9f93-5c28-a762-00000000055e 11728 1726882196.33081: variable 'ansible_search_path' from source: unknown 11728 1726882196.33085: variable 'ansible_search_path' from source: unknown 11728 1726882196.33120: calling self._execute() 11728 1726882196.33180: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882196.33184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882196.33197: variable 'omit' from source: magic vars 11728 1726882196.33527: variable 'ansible_distribution_major_version' from source: facts 11728 1726882196.33531: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882196.33641: variable 'profile_stat' from source: set_fact 11728 1726882196.33650: Evaluated conditional (profile_stat.stat.exists): False 11728 1726882196.33653: when evaluation is False, skipping this task 11728 1726882196.33656: _execute() done 11728 1726882196.33659: dumping result to json 11728 1726882196.33662: done dumping result, returning 11728 1726882196.33686: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.0 [12673a56-9f93-5c28-a762-00000000055e] 11728 1726882196.33689: sending task result for task 12673a56-9f93-5c28-a762-00000000055e 11728 1726882196.33782: done sending task result for task 12673a56-9f93-5c28-a762-00000000055e 11728 1726882196.33785: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11728 1726882196.33876: no more pending results, returning what we have 11728 1726882196.33880: results queue empty 11728 1726882196.33881: checking for any_errors_fatal 11728 1726882196.33887: done checking for any_errors_fatal 11728 1726882196.33888: checking for max_fail_percentage 11728 1726882196.33891: done checking for max_fail_percentage 11728 1726882196.33892: checking to see if all hosts have failed and the running result is not ok 11728 1726882196.33894: done checking to see if all hosts have failed 11728 1726882196.33895: getting the remaining hosts for this loop 11728 1726882196.33897: done getting the remaining hosts for this loop 11728 1726882196.33900: getting the next task for host managed_node3 11728 1726882196.33908: done getting next task for host managed_node3 11728 1726882196.33911: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 11728 1726882196.33917: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882196.33921: getting variables 11728 1726882196.33922: in VariableManager get_vars() 11728 1726882196.33950: Calling all_inventory to load vars for managed_node3 11728 1726882196.33952: Calling groups_inventory to load vars for managed_node3 11728 1726882196.33955: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882196.33964: Calling all_plugins_play to load vars for managed_node3 11728 1726882196.33967: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882196.33969: Calling groups_plugins_play to load vars for managed_node3 11728 1726882196.34888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882196.36050: done with get_vars() 11728 1726882196.36063: done getting variables 11728 1726882196.36109: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11728 1726882196.36211: variable 'profile' from source: include params 11728 1726882196.36215: variable 'bond_port_profile' from source: include params 11728 1726882196.36264: variable 'bond_port_profile' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.0] ********************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:29:56 -0400 (0:00:00.038) 0:00:21.215 ****** 11728 1726882196.36291: entering _queue_task() for managed_node3/set_fact 11728 1726882196.36485: worker is 1 (out of 1 available) 11728 1726882196.36499: exiting _queue_task() for managed_node3/set_fact 11728 1726882196.36511: done queuing things up, now waiting for results queue to drain 11728 1726882196.36512: waiting for pending results... 
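[annotation] This task and its three siblings in get_profile_stat.yml (lines 49, 56, 62 and 69: get/verify the ansible_managed comment and get/verify the fingerprint comment) all skip with false_condition: profile_stat.stat.exists. profile_stat is presumably a registered stat of the ifcfg path for the profile, and the nmcli output earlier shows bond0.0 stored as a NetworkManager keyfile under /etc/NetworkManager/system-connections, so there is no ifcfg file to inspect. Because these tasks never run here, the snippet below is only an illustrative shape with hypothetical details, not the real task body:

    - name: Get the ansible_managed comment in ifcfg-{{ profile }}
      command: grep 'Ansible managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # illustrative path and pattern
      register: ansible_managed_comment                                                    # hypothetical register name
      when: profile_stat.stat.exists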
11728 1726882196.36681: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 11728 1726882196.36770: in run() - task 12673a56-9f93-5c28-a762-00000000055f 11728 1726882196.36780: variable 'ansible_search_path' from source: unknown 11728 1726882196.36784: variable 'ansible_search_path' from source: unknown 11728 1726882196.36816: calling self._execute() 11728 1726882196.36884: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882196.36888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882196.36899: variable 'omit' from source: magic vars 11728 1726882196.37167: variable 'ansible_distribution_major_version' from source: facts 11728 1726882196.37179: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882196.37263: variable 'profile_stat' from source: set_fact 11728 1726882196.37272: Evaluated conditional (profile_stat.stat.exists): False 11728 1726882196.37275: when evaluation is False, skipping this task 11728 1726882196.37280: _execute() done 11728 1726882196.37283: dumping result to json 11728 1726882196.37286: done dumping result, returning 11728 1726882196.37303: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 [12673a56-9f93-5c28-a762-00000000055f] 11728 1726882196.37307: sending task result for task 12673a56-9f93-5c28-a762-00000000055f 11728 1726882196.37379: done sending task result for task 12673a56-9f93-5c28-a762-00000000055f 11728 1726882196.37382: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11728 1726882196.37441: no more pending results, returning what we have 11728 1726882196.37445: results queue empty 11728 1726882196.37446: checking for any_errors_fatal 11728 1726882196.37450: done checking for any_errors_fatal 11728 1726882196.37451: checking for max_fail_percentage 11728 1726882196.37452: done checking for max_fail_percentage 11728 1726882196.37453: checking to see if all hosts have failed and the running result is not ok 11728 1726882196.37453: done checking to see if all hosts have failed 11728 1726882196.37454: getting the remaining hosts for this loop 11728 1726882196.37456: done getting the remaining hosts for this loop 11728 1726882196.37459: getting the next task for host managed_node3 11728 1726882196.37465: done getting next task for host managed_node3 11728 1726882196.37467: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 11728 1726882196.37472: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882196.37475: getting variables 11728 1726882196.37476: in VariableManager get_vars() 11728 1726882196.37503: Calling all_inventory to load vars for managed_node3 11728 1726882196.37506: Calling groups_inventory to load vars for managed_node3 11728 1726882196.37508: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882196.37519: Calling all_plugins_play to load vars for managed_node3 11728 1726882196.37522: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882196.37524: Calling groups_plugins_play to load vars for managed_node3 11728 1726882196.38321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882196.39297: done with get_vars() 11728 1726882196.39313: done getting variables 11728 1726882196.39365: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11728 1726882196.39442: variable 'profile' from source: include params 11728 1726882196.39444: variable 'bond_port_profile' from source: include params 11728 1726882196.39492: variable 'bond_port_profile' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.0] **************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:29:56 -0400 (0:00:00.032) 0:00:21.247 ****** 11728 1726882196.39519: entering _queue_task() for managed_node3/command 11728 1726882196.39723: worker is 1 (out of 1 available) 11728 1726882196.39735: exiting _queue_task() for managed_node3/command 11728 1726882196.39747: done queuing things up, now waiting for results queue to drain 11728 1726882196.39748: waiting for pending results... 
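[annotation] Both 'profile' and 'bond_port_profile' are reported "from source: include params", which is how the templated task names resolve ifcfg-{{ profile }} to ifcfg-bond0.0: the test passes the current bond port profile into the included task file as the generic 'profile' variable. The calling include is outside this excerpt; a plausible sketch of that wiring, with the wrapper task name and the outer loop being assumptions:

    - name: Check the current bond port profile   # hypothetical wrapper task
      include_tasks: tasks/get_profile_stat.yml
      vars:
        profile: "{{ bond_port_profile }}"   # bond_port_profile would come from an outer loop over the bond ports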
11728 1726882196.39928: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.0 11728 1726882196.40012: in run() - task 12673a56-9f93-5c28-a762-000000000560 11728 1726882196.40023: variable 'ansible_search_path' from source: unknown 11728 1726882196.40027: variable 'ansible_search_path' from source: unknown 11728 1726882196.40054: calling self._execute() 11728 1726882196.40145: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882196.40149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882196.40152: variable 'omit' from source: magic vars 11728 1726882196.40469: variable 'ansible_distribution_major_version' from source: facts 11728 1726882196.40497: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882196.40603: variable 'profile_stat' from source: set_fact 11728 1726882196.40612: Evaluated conditional (profile_stat.stat.exists): False 11728 1726882196.40615: when evaluation is False, skipping this task 11728 1726882196.40618: _execute() done 11728 1726882196.40621: dumping result to json 11728 1726882196.40623: done dumping result, returning 11728 1726882196.40636: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.0 [12673a56-9f93-5c28-a762-000000000560] 11728 1726882196.40646: sending task result for task 12673a56-9f93-5c28-a762-000000000560 11728 1726882196.40736: done sending task result for task 12673a56-9f93-5c28-a762-000000000560 11728 1726882196.40739: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11728 1726882196.40822: no more pending results, returning what we have 11728 1726882196.40826: results queue empty 11728 1726882196.40827: checking for any_errors_fatal 11728 1726882196.40833: done checking for any_errors_fatal 11728 1726882196.40834: checking for max_fail_percentage 11728 1726882196.40835: done checking for max_fail_percentage 11728 1726882196.40836: checking to see if all hosts have failed and the running result is not ok 11728 1726882196.40837: done checking to see if all hosts have failed 11728 1726882196.40837: getting the remaining hosts for this loop 11728 1726882196.40839: done getting the remaining hosts for this loop 11728 1726882196.40841: getting the next task for host managed_node3 11728 1726882196.40847: done getting next task for host managed_node3 11728 1726882196.40849: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 11728 1726882196.40854: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882196.40858: getting variables 11728 1726882196.40859: in VariableManager get_vars() 11728 1726882196.40886: Calling all_inventory to load vars for managed_node3 11728 1726882196.40889: Calling groups_inventory to load vars for managed_node3 11728 1726882196.40895: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882196.40906: Calling all_plugins_play to load vars for managed_node3 11728 1726882196.40908: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882196.40911: Calling groups_plugins_play to load vars for managed_node3 11728 1726882196.41836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882196.42952: done with get_vars() 11728 1726882196.42972: done getting variables 11728 1726882196.43018: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11728 1726882196.43119: variable 'profile' from source: include params 11728 1726882196.43125: variable 'bond_port_profile' from source: include params 11728 1726882196.43172: variable 'bond_port_profile' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.0] ************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:29:56 -0400 (0:00:00.036) 0:00:21.284 ****** 11728 1726882196.43209: entering _queue_task() for managed_node3/set_fact 11728 1726882196.43429: worker is 1 (out of 1 available) 11728 1726882196.43442: exiting _queue_task() for managed_node3/set_fact 11728 1726882196.43456: done queuing things up, now waiting for results queue to drain 11728 1726882196.43458: waiting for pending results... 
11728 1726882196.43630: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.0 11728 1726882196.43713: in run() - task 12673a56-9f93-5c28-a762-000000000561 11728 1726882196.43725: variable 'ansible_search_path' from source: unknown 11728 1726882196.43729: variable 'ansible_search_path' from source: unknown 11728 1726882196.43756: calling self._execute() 11728 1726882196.43842: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882196.43845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882196.43865: variable 'omit' from source: magic vars 11728 1726882196.44164: variable 'ansible_distribution_major_version' from source: facts 11728 1726882196.44173: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882196.44256: variable 'profile_stat' from source: set_fact 11728 1726882196.44265: Evaluated conditional (profile_stat.stat.exists): False 11728 1726882196.44268: when evaluation is False, skipping this task 11728 1726882196.44271: _execute() done 11728 1726882196.44274: dumping result to json 11728 1726882196.44276: done dumping result, returning 11728 1726882196.44282: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.0 [12673a56-9f93-5c28-a762-000000000561] 11728 1726882196.44286: sending task result for task 12673a56-9f93-5c28-a762-000000000561 11728 1726882196.44377: done sending task result for task 12673a56-9f93-5c28-a762-000000000561 11728 1726882196.44380: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11728 1726882196.44437: no more pending results, returning what we have 11728 1726882196.44441: results queue empty 11728 1726882196.44442: checking for any_errors_fatal 11728 1726882196.44448: done checking for any_errors_fatal 11728 1726882196.44449: checking for max_fail_percentage 11728 1726882196.44450: done checking for max_fail_percentage 11728 1726882196.44451: checking to see if all hosts have failed and the running result is not ok 11728 1726882196.44451: done checking to see if all hosts have failed 11728 1726882196.44452: getting the remaining hosts for this loop 11728 1726882196.44454: done getting the remaining hosts for this loop 11728 1726882196.44457: getting the next task for host managed_node3 11728 1726882196.44465: done getting next task for host managed_node3 11728 1726882196.44467: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 11728 1726882196.44474: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882196.44478: getting variables 11728 1726882196.44480: in VariableManager get_vars() 11728 1726882196.44534: Calling all_inventory to load vars for managed_node3 11728 1726882196.44536: Calling groups_inventory to load vars for managed_node3 11728 1726882196.44539: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882196.44548: Calling all_plugins_play to load vars for managed_node3 11728 1726882196.44550: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882196.44553: Calling groups_plugins_play to load vars for managed_node3 11728 1726882196.45323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882196.46260: done with get_vars() 11728 1726882196.46275: done getting variables 11728 1726882196.46319: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11728 1726882196.46398: variable 'profile' from source: include params 11728 1726882196.46401: variable 'bond_port_profile' from source: include params 11728 1726882196.46438: variable 'bond_port_profile' from source: include params TASK [Assert that the profile is present - 'bond0.0'] ************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:29:56 -0400 (0:00:00.032) 0:00:21.317 ****** 11728 1726882196.46461: entering _queue_task() for managed_node3/assert 11728 1726882196.46665: worker is 1 (out of 1 available) 11728 1726882196.46677: exiting _queue_task() for managed_node3/assert 11728 1726882196.46689: done queuing things up, now waiting for results queue to drain 11728 1726882196.46690: waiting for pending results... 
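[annotation] The remaining tasks in this block come from assert_profile_present.yml (lines 5, 10 and 15) and assert the three lsr_net_profile_* flags set earlier. Their bodies are not printed in the log; a sketch consistent with the conditionals the log shows being evaluated (lsr_net_profile_exists and lsr_net_profile_ansible_managed, both True) and with the fingerprint flag set by the same set_fact task:

    - name: Assert that the profile is present - '{{ profile }}'
      assert:
        that: lsr_net_profile_exists

    - name: Assert that the ansible managed comment is present in '{{ profile }}'
      assert:
        that: lsr_net_profile_ansible_managed

    - name: Assert that the fingerprint comment is present in {{ profile }}
      assert:
        that: lsr_net_profile_fingerprint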
11728 1726882196.46885: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.0' 11728 1726882196.46966: in run() - task 12673a56-9f93-5c28-a762-0000000004e1 11728 1726882196.46977: variable 'ansible_search_path' from source: unknown 11728 1726882196.46981: variable 'ansible_search_path' from source: unknown 11728 1726882196.47010: calling self._execute() 11728 1726882196.47076: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882196.47081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882196.47089: variable 'omit' from source: magic vars 11728 1726882196.47343: variable 'ansible_distribution_major_version' from source: facts 11728 1726882196.47353: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882196.47358: variable 'omit' from source: magic vars 11728 1726882196.47399: variable 'omit' from source: magic vars 11728 1726882196.47464: variable 'profile' from source: include params 11728 1726882196.47467: variable 'bond_port_profile' from source: include params 11728 1726882196.47529: variable 'bond_port_profile' from source: include params 11728 1726882196.47550: variable 'omit' from source: magic vars 11728 1726882196.47584: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882196.47617: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882196.47650: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882196.47680: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882196.47686: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882196.47769: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882196.47775: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882196.47779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882196.47874: Set connection var ansible_connection to ssh 11728 1726882196.47880: Set connection var ansible_shell_executable to /bin/sh 11728 1726882196.47894: Set connection var ansible_timeout to 10 11728 1726882196.47897: Set connection var ansible_shell_type to sh 11728 1726882196.47900: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882196.47908: Set connection var ansible_pipelining to False 11728 1726882196.47924: variable 'ansible_shell_executable' from source: unknown 11728 1726882196.47927: variable 'ansible_connection' from source: unknown 11728 1726882196.47929: variable 'ansible_module_compression' from source: unknown 11728 1726882196.47931: variable 'ansible_shell_type' from source: unknown 11728 1726882196.47934: variable 'ansible_shell_executable' from source: unknown 11728 1726882196.47936: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882196.47940: variable 'ansible_pipelining' from source: unknown 11728 1726882196.47943: variable 'ansible_timeout' from source: unknown 11728 1726882196.47957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882196.48050: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882196.48061: variable 'omit' from source: magic vars 11728 1726882196.48064: starting attempt loop 11728 1726882196.48068: running the handler 11728 1726882196.48145: variable 'lsr_net_profile_exists' from source: set_fact 11728 1726882196.48149: Evaluated conditional (lsr_net_profile_exists): True 11728 1726882196.48154: handler run complete 11728 1726882196.48168: attempt loop complete, returning result 11728 1726882196.48171: _execute() done 11728 1726882196.48174: dumping result to json 11728 1726882196.48176: done dumping result, returning 11728 1726882196.48187: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.0' [12673a56-9f93-5c28-a762-0000000004e1] 11728 1726882196.48189: sending task result for task 12673a56-9f93-5c28-a762-0000000004e1 11728 1726882196.48272: done sending task result for task 12673a56-9f93-5c28-a762-0000000004e1 11728 1726882196.48275: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11728 1726882196.48352: no more pending results, returning what we have 11728 1726882196.48355: results queue empty 11728 1726882196.48356: checking for any_errors_fatal 11728 1726882196.48361: done checking for any_errors_fatal 11728 1726882196.48362: checking for max_fail_percentage 11728 1726882196.48364: done checking for max_fail_percentage 11728 1726882196.48364: checking to see if all hosts have failed and the running result is not ok 11728 1726882196.48365: done checking to see if all hosts have failed 11728 1726882196.48366: getting the remaining hosts for this loop 11728 1726882196.48370: done getting the remaining hosts for this loop 11728 1726882196.48374: getting the next task for host managed_node3 11728 1726882196.48383: done getting next task for host managed_node3 11728 1726882196.48387: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 11728 1726882196.48395: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882196.48399: getting variables 11728 1726882196.48401: in VariableManager get_vars() 11728 1726882196.48434: Calling all_inventory to load vars for managed_node3 11728 1726882196.48437: Calling groups_inventory to load vars for managed_node3 11728 1726882196.48440: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882196.48448: Calling all_plugins_play to load vars for managed_node3 11728 1726882196.48451: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882196.48453: Calling groups_plugins_play to load vars for managed_node3 11728 1726882196.49765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882196.50978: done with get_vars() 11728 1726882196.51001: done getting variables 11728 1726882196.51053: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11728 1726882196.51372: variable 'profile' from source: include params 11728 1726882196.51375: variable 'bond_port_profile' from source: include params 11728 1726882196.51432: variable 'bond_port_profile' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.0'] ********* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:29:56 -0400 (0:00:00.049) 0:00:21.367 ****** 11728 1726882196.51464: entering _queue_task() for managed_node3/assert 11728 1726882196.51915: worker is 1 (out of 1 available) 11728 1726882196.51928: exiting _queue_task() for managed_node3/assert 11728 1726882196.51942: done queuing things up, now waiting for results queue to drain 11728 1726882196.51943: waiting for pending results... 
11728 1726882196.52157: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.0' 11728 1726882196.52236: in run() - task 12673a56-9f93-5c28-a762-0000000004e2 11728 1726882196.52249: variable 'ansible_search_path' from source: unknown 11728 1726882196.52252: variable 'ansible_search_path' from source: unknown 11728 1726882196.52281: calling self._execute() 11728 1726882196.52355: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882196.52358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882196.52367: variable 'omit' from source: magic vars 11728 1726882196.52627: variable 'ansible_distribution_major_version' from source: facts 11728 1726882196.52637: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882196.52642: variable 'omit' from source: magic vars 11728 1726882196.52679: variable 'omit' from source: magic vars 11728 1726882196.52750: variable 'profile' from source: include params 11728 1726882196.52756: variable 'bond_port_profile' from source: include params 11728 1726882196.52803: variable 'bond_port_profile' from source: include params 11728 1726882196.52818: variable 'omit' from source: magic vars 11728 1726882196.52851: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882196.52880: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882196.52899: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882196.52912: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882196.52922: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882196.52947: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882196.52950: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882196.52953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882196.53021: Set connection var ansible_connection to ssh 11728 1726882196.53029: Set connection var ansible_shell_executable to /bin/sh 11728 1726882196.53035: Set connection var ansible_timeout to 10 11728 1726882196.53039: Set connection var ansible_shell_type to sh 11728 1726882196.53045: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882196.53049: Set connection var ansible_pipelining to False 11728 1726882196.53068: variable 'ansible_shell_executable' from source: unknown 11728 1726882196.53071: variable 'ansible_connection' from source: unknown 11728 1726882196.53073: variable 'ansible_module_compression' from source: unknown 11728 1726882196.53076: variable 'ansible_shell_type' from source: unknown 11728 1726882196.53078: variable 'ansible_shell_executable' from source: unknown 11728 1726882196.53080: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882196.53085: variable 'ansible_pipelining' from source: unknown 11728 1726882196.53088: variable 'ansible_timeout' from source: unknown 11728 1726882196.53090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882196.53188: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882196.53199: variable 'omit' from source: magic vars 11728 1726882196.53211: starting attempt loop 11728 1726882196.53214: running the handler 11728 1726882196.53285: variable 'lsr_net_profile_ansible_managed' from source: set_fact 11728 1726882196.53289: Evaluated conditional (lsr_net_profile_ansible_managed): True 11728 1726882196.53298: handler run complete 11728 1726882196.53309: attempt loop complete, returning result 11728 1726882196.53313: _execute() done 11728 1726882196.53316: dumping result to json 11728 1726882196.53318: done dumping result, returning 11728 1726882196.53330: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.0' [12673a56-9f93-5c28-a762-0000000004e2] 11728 1726882196.53332: sending task result for task 12673a56-9f93-5c28-a762-0000000004e2 11728 1726882196.53410: done sending task result for task 12673a56-9f93-5c28-a762-0000000004e2 11728 1726882196.53413: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11728 1726882196.53466: no more pending results, returning what we have 11728 1726882196.53469: results queue empty 11728 1726882196.53470: checking for any_errors_fatal 11728 1726882196.53478: done checking for any_errors_fatal 11728 1726882196.53479: checking for max_fail_percentage 11728 1726882196.53480: done checking for max_fail_percentage 11728 1726882196.53481: checking to see if all hosts have failed and the running result is not ok 11728 1726882196.53482: done checking to see if all hosts have failed 11728 1726882196.53483: getting the remaining hosts for this loop 11728 1726882196.53484: done getting the remaining hosts for this loop 11728 1726882196.53487: getting the next task for host managed_node3 11728 1726882196.53497: done getting next task for host managed_node3 11728 1726882196.53500: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 11728 1726882196.53504: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882196.53507: getting variables 11728 1726882196.53508: in VariableManager get_vars() 11728 1726882196.53539: Calling all_inventory to load vars for managed_node3 11728 1726882196.53541: Calling groups_inventory to load vars for managed_node3 11728 1726882196.53544: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882196.53553: Calling all_plugins_play to load vars for managed_node3 11728 1726882196.53555: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882196.53557: Calling groups_plugins_play to load vars for managed_node3 11728 1726882196.54320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882196.55712: done with get_vars() 11728 1726882196.55733: done getting variables 11728 1726882196.55791: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11728 1726882196.56103: variable 'profile' from source: include params 11728 1726882196.56107: variable 'bond_port_profile' from source: include params 11728 1726882196.56162: variable 'bond_port_profile' from source: include params TASK [Assert that the fingerprint comment is present in bond0.0] *************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:29:56 -0400 (0:00:00.047) 0:00:21.414 ****** 11728 1726882196.56192: entering _queue_task() for managed_node3/assert 11728 1726882196.56913: worker is 1 (out of 1 available) 11728 1726882196.56925: exiting _queue_task() for managed_node3/assert 11728 1726882196.56937: done queuing things up, now waiting for results queue to drain 11728 1726882196.56938: waiting for pending results... 
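The assert action above never touches the remote host: the handler simply evaluates the boolean fact lsr_net_profile_ansible_managed (source: set_fact) on the controller and returns "All assertions passed". The verbatim contents of assert_profile_present.yml are not shown in this log; a minimal sketch of such a task, assuming the usual ansible.builtin.assert form, would be:

- name: "Assert that the ansible managed comment is present in '{{ profile }}'"
  ansible.builtin.assert:
    that:
      - lsr_net_profile_ansible_managed
    # fail_msg is illustrative only; the real task's message is not visible in this log
    fail_msg: "ansible managed comment missing from {{ profile }}"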
11728 1726882196.57377: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.0 11728 1726882196.57382: in run() - task 12673a56-9f93-5c28-a762-0000000004e3 11728 1726882196.57385: variable 'ansible_search_path' from source: unknown 11728 1726882196.57388: variable 'ansible_search_path' from source: unknown 11728 1726882196.57391: calling self._execute() 11728 1726882196.57425: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882196.57429: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882196.57441: variable 'omit' from source: magic vars 11728 1726882196.57825: variable 'ansible_distribution_major_version' from source: facts 11728 1726882196.57839: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882196.57843: variable 'omit' from source: magic vars 11728 1726882196.57900: variable 'omit' from source: magic vars 11728 1726882196.58001: variable 'profile' from source: include params 11728 1726882196.58006: variable 'bond_port_profile' from source: include params 11728 1726882196.58073: variable 'bond_port_profile' from source: include params 11728 1726882196.58096: variable 'omit' from source: magic vars 11728 1726882196.58138: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882196.58176: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882196.58199: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882196.58218: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882196.58229: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882196.58261: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882196.58264: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882196.58273: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882196.58369: Set connection var ansible_connection to ssh 11728 1726882196.58381: Set connection var ansible_shell_executable to /bin/sh 11728 1726882196.58384: Set connection var ansible_timeout to 10 11728 1726882196.58386: Set connection var ansible_shell_type to sh 11728 1726882196.58401: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882196.58405: Set connection var ansible_pipelining to False 11728 1726882196.58432: variable 'ansible_shell_executable' from source: unknown 11728 1726882196.58435: variable 'ansible_connection' from source: unknown 11728 1726882196.58437: variable 'ansible_module_compression' from source: unknown 11728 1726882196.58440: variable 'ansible_shell_type' from source: unknown 11728 1726882196.58442: variable 'ansible_shell_executable' from source: unknown 11728 1726882196.58444: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882196.58446: variable 'ansible_pipelining' from source: unknown 11728 1726882196.58449: variable 'ansible_timeout' from source: unknown 11728 1726882196.58454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882196.58600: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882196.58609: variable 'omit' from source: magic vars 11728 1726882196.58615: starting attempt loop 11728 1726882196.58618: running the handler 11728 1726882196.58729: variable 'lsr_net_profile_fingerprint' from source: set_fact 11728 1726882196.58732: Evaluated conditional (lsr_net_profile_fingerprint): True 11728 1726882196.58739: handler run complete 11728 1726882196.58754: attempt loop complete, returning result 11728 1726882196.58756: _execute() done 11728 1726882196.58759: dumping result to json 11728 1726882196.58762: done dumping result, returning 11728 1726882196.58769: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.0 [12673a56-9f93-5c28-a762-0000000004e3] 11728 1726882196.58818: sending task result for task 12673a56-9f93-5c28-a762-0000000004e3 11728 1726882196.58885: done sending task result for task 12673a56-9f93-5c28-a762-0000000004e3 11728 1726882196.58888: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11728 1726882196.58940: no more pending results, returning what we have 11728 1726882196.58944: results queue empty 11728 1726882196.58945: checking for any_errors_fatal 11728 1726882196.58952: done checking for any_errors_fatal 11728 1726882196.58952: checking for max_fail_percentage 11728 1726882196.58954: done checking for max_fail_percentage 11728 1726882196.58955: checking to see if all hosts have failed and the running result is not ok 11728 1726882196.58956: done checking to see if all hosts have failed 11728 1726882196.58956: getting the remaining hosts for this loop 11728 1726882196.58958: done getting the remaining hosts for this loop 11728 1726882196.58961: getting the next task for host managed_node3 11728 1726882196.58973: done getting next task for host managed_node3 11728 1726882196.58977: ^ task is: TASK: Include the task 'get_profile_stat.yml' 11728 1726882196.58981: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882196.58985: getting variables 11728 1726882196.58987: in VariableManager get_vars() 11728 1726882196.59023: Calling all_inventory to load vars for managed_node3 11728 1726882196.59026: Calling groups_inventory to load vars for managed_node3 11728 1726882196.59031: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882196.59042: Calling all_plugins_play to load vars for managed_node3 11728 1726882196.59045: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882196.59048: Calling groups_plugins_play to load vars for managed_node3 11728 1726882196.61681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882196.63382: done with get_vars() 11728 1726882196.63403: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:29:56 -0400 (0:00:00.072) 0:00:21.487 ****** 11728 1726882196.63478: entering _queue_task() for managed_node3/include_tasks 11728 1726882196.63719: worker is 1 (out of 1 available) 11728 1726882196.63732: exiting _queue_task() for managed_node3/include_tasks 11728 1726882196.63744: done queuing things up, now waiting for results queue to drain 11728 1726882196.63746: waiting for pending results... 11728 1726882196.63922: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 11728 1726882196.64011: in run() - task 12673a56-9f93-5c28-a762-0000000004e7 11728 1726882196.64024: variable 'ansible_search_path' from source: unknown 11728 1726882196.64027: variable 'ansible_search_path' from source: unknown 11728 1726882196.64055: calling self._execute() 11728 1726882196.64126: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882196.64130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882196.64138: variable 'omit' from source: magic vars 11728 1726882196.64406: variable 'ansible_distribution_major_version' from source: facts 11728 1726882196.64418: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882196.64423: _execute() done 11728 1726882196.64425: dumping result to json 11728 1726882196.64428: done dumping result, returning 11728 1726882196.64434: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [12673a56-9f93-5c28-a762-0000000004e7] 11728 1726882196.64440: sending task result for task 12673a56-9f93-5c28-a762-0000000004e7 11728 1726882196.64528: done sending task result for task 12673a56-9f93-5c28-a762-0000000004e7 11728 1726882196.64531: WORKER PROCESS EXITING 11728 1726882196.64563: no more pending results, returning what we have 11728 1726882196.64567: in VariableManager get_vars() 11728 1726882196.64605: Calling all_inventory to load vars for managed_node3 11728 1726882196.64608: Calling groups_inventory to load vars for managed_node3 11728 1726882196.64612: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882196.64623: Calling all_plugins_play to load vars for managed_node3 11728 1726882196.64626: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882196.64628: Calling groups_plugins_play to load vars for managed_node3 11728 1726882196.66274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 11728 1726882196.67211: done with get_vars() 11728 1726882196.67225: variable 'ansible_search_path' from source: unknown 11728 1726882196.67226: variable 'ansible_search_path' from source: unknown 11728 1726882196.67251: we have included files to process 11728 1726882196.67251: generating all_blocks data 11728 1726882196.67253: done generating all_blocks data 11728 1726882196.67256: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11728 1726882196.67257: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11728 1726882196.67258: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11728 1726882196.67863: done processing included file 11728 1726882196.67864: iterating over new_blocks loaded from include file 11728 1726882196.67865: in VariableManager get_vars() 11728 1726882196.67876: done with get_vars() 11728 1726882196.67877: filtering new block on tags 11728 1726882196.67924: done filtering new block on tags 11728 1726882196.67926: in VariableManager get_vars() 11728 1726882196.67938: done with get_vars() 11728 1726882196.67939: filtering new block on tags 11728 1726882196.67976: done filtering new block on tags 11728 1726882196.67977: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 11728 1726882196.67981: extending task lists for all hosts with included blocks 11728 1726882196.68234: done extending task lists 11728 1726882196.68235: done processing included files 11728 1726882196.68236: results queue empty 11728 1726882196.68236: checking for any_errors_fatal 11728 1726882196.68239: done checking for any_errors_fatal 11728 1726882196.68239: checking for max_fail_percentage 11728 1726882196.68240: done checking for max_fail_percentage 11728 1726882196.68240: checking to see if all hosts have failed and the running result is not ok 11728 1726882196.68241: done checking to see if all hosts have failed 11728 1726882196.68241: getting the remaining hosts for this loop 11728 1726882196.68242: done getting the remaining hosts for this loop 11728 1726882196.68244: getting the next task for host managed_node3 11728 1726882196.68247: done getting next task for host managed_node3 11728 1726882196.68248: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 11728 1726882196.68251: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882196.68252: getting variables 11728 1726882196.68253: in VariableManager get_vars() 11728 1726882196.68261: Calling all_inventory to load vars for managed_node3 11728 1726882196.68263: Calling groups_inventory to load vars for managed_node3 11728 1726882196.68264: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882196.68268: Calling all_plugins_play to load vars for managed_node3 11728 1726882196.68270: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882196.68271: Calling groups_plugins_play to load vars for managed_node3 11728 1726882196.68970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882196.70499: done with get_vars() 11728 1726882196.70524: done getting variables 11728 1726882196.70568: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:29:56 -0400 (0:00:00.071) 0:00:21.558 ****** 11728 1726882196.70606: entering _queue_task() for managed_node3/set_fact 11728 1726882196.70965: worker is 1 (out of 1 available) 11728 1726882196.70977: exiting _queue_task() for managed_node3/set_fact 11728 1726882196.70989: done queuing things up, now waiting for results queue to drain 11728 1726882196.70991: waiting for pending results... 
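The include at assert_profile_present.yml:3 is resolved entirely on the controller: the included file is loaded, its blocks are filtered on tags, and the host's task list is extended with the new tasks. Its shape in the playbook is not shown here; a plausible sketch, assuming the profile name is handed down as an include parameter (the log reports 'profile' and 'bond_port_profile' from source: include params), is:

- name: Include the task 'get_profile_stat.yml'
  ansible.builtin.include_tasks: get_profile_stat.yml
  vars:
    profile: "{{ bond_port_profile }}"   # assumed; the log only shows that 'profile' arrives via include params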
11728 1726882196.71516: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 11728 1726882196.71524: in run() - task 12673a56-9f93-5c28-a762-0000000005b4 11728 1726882196.71527: variable 'ansible_search_path' from source: unknown 11728 1726882196.71530: variable 'ansible_search_path' from source: unknown 11728 1726882196.71533: calling self._execute() 11728 1726882196.71653: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882196.71657: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882196.71659: variable 'omit' from source: magic vars 11728 1726882196.71980: variable 'ansible_distribution_major_version' from source: facts 11728 1726882196.71987: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882196.72042: variable 'omit' from source: magic vars 11728 1726882196.72061: variable 'omit' from source: magic vars 11728 1726882196.72098: variable 'omit' from source: magic vars 11728 1726882196.72138: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882196.72183: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882196.72208: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882196.72227: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882196.72240: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882196.72277: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882196.72280: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882196.72283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882196.72389: Set connection var ansible_connection to ssh 11728 1726882196.72411: Set connection var ansible_shell_executable to /bin/sh 11728 1726882196.72414: Set connection var ansible_timeout to 10 11728 1726882196.72416: Set connection var ansible_shell_type to sh 11728 1726882196.72418: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882196.72421: Set connection var ansible_pipelining to False 11728 1726882196.72446: variable 'ansible_shell_executable' from source: unknown 11728 1726882196.72449: variable 'ansible_connection' from source: unknown 11728 1726882196.72452: variable 'ansible_module_compression' from source: unknown 11728 1726882196.72454: variable 'ansible_shell_type' from source: unknown 11728 1726882196.72457: variable 'ansible_shell_executable' from source: unknown 11728 1726882196.72459: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882196.72500: variable 'ansible_pipelining' from source: unknown 11728 1726882196.72503: variable 'ansible_timeout' from source: unknown 11728 1726882196.72506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882196.72631: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882196.72635: variable 
'omit' from source: magic vars 11728 1726882196.72700: starting attempt loop 11728 1726882196.72704: running the handler 11728 1726882196.72707: handler run complete 11728 1726882196.72709: attempt loop complete, returning result 11728 1726882196.72711: _execute() done 11728 1726882196.72713: dumping result to json 11728 1726882196.72716: done dumping result, returning 11728 1726882196.72718: done running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [12673a56-9f93-5c28-a762-0000000005b4] 11728 1726882196.72720: sending task result for task 12673a56-9f93-5c28-a762-0000000005b4 11728 1726882196.72785: done sending task result for task 12673a56-9f93-5c28-a762-0000000005b4 11728 1726882196.72788: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 11728 1726882196.72850: no more pending results, returning what we have 11728 1726882196.72855: results queue empty 11728 1726882196.72856: checking for any_errors_fatal 11728 1726882196.72858: done checking for any_errors_fatal 11728 1726882196.72859: checking for max_fail_percentage 11728 1726882196.72861: done checking for max_fail_percentage 11728 1726882196.72861: checking to see if all hosts have failed and the running result is not ok 11728 1726882196.72862: done checking to see if all hosts have failed 11728 1726882196.72863: getting the remaining hosts for this loop 11728 1726882196.72864: done getting the remaining hosts for this loop 11728 1726882196.72868: getting the next task for host managed_node3 11728 1726882196.72876: done getting next task for host managed_node3 11728 1726882196.72878: ^ task is: TASK: Stat profile file 11728 1726882196.72884: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882196.72888: getting variables 11728 1726882196.72889: in VariableManager get_vars() 11728 1726882196.72925: Calling all_inventory to load vars for managed_node3 11728 1726882196.72928: Calling groups_inventory to load vars for managed_node3 11728 1726882196.72932: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882196.72943: Calling all_plugins_play to load vars for managed_node3 11728 1726882196.72946: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882196.72949: Calling groups_plugins_play to load vars for managed_node3 11728 1726882196.74462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882196.75956: done with get_vars() 11728 1726882196.75980: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:29:56 -0400 (0:00:00.054) 0:00:21.613 ****** 11728 1726882196.76085: entering _queue_task() for managed_node3/stat 11728 1726882196.76422: worker is 1 (out of 1 available) 11728 1726882196.76435: exiting _queue_task() for managed_node3/stat 11728 1726882196.76447: done queuing things up, now waiting for results queue to drain 11728 1726882196.76448: waiting for pending results... 11728 1726882196.76822: running TaskExecutor() for managed_node3/TASK: Stat profile file 11728 1726882196.76914: in run() - task 12673a56-9f93-5c28-a762-0000000005b5 11728 1726882196.76920: variable 'ansible_search_path' from source: unknown 11728 1726882196.76922: variable 'ansible_search_path' from source: unknown 11728 1726882196.76925: calling self._execute() 11728 1726882196.77200: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882196.77204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882196.77207: variable 'omit' from source: magic vars 11728 1726882196.77389: variable 'ansible_distribution_major_version' from source: facts 11728 1726882196.77406: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882196.77411: variable 'omit' from source: magic vars 11728 1726882196.77472: variable 'omit' from source: magic vars 11728 1726882196.77678: variable 'profile' from source: include params 11728 1726882196.77681: variable 'bond_port_profile' from source: include params 11728 1726882196.77683: variable 'bond_port_profile' from source: include params 11728 1726882196.77686: variable 'omit' from source: magic vars 11728 1726882196.77713: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882196.77749: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882196.77769: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882196.77786: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882196.77804: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882196.78004: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882196.78007: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 
1726882196.78011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882196.78013: Set connection var ansible_connection to ssh 11728 1726882196.78015: Set connection var ansible_shell_executable to /bin/sh 11728 1726882196.78018: Set connection var ansible_timeout to 10 11728 1726882196.78021: Set connection var ansible_shell_type to sh 11728 1726882196.78025: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882196.78029: Set connection var ansible_pipelining to False 11728 1726882196.78031: variable 'ansible_shell_executable' from source: unknown 11728 1726882196.78034: variable 'ansible_connection' from source: unknown 11728 1726882196.78037: variable 'ansible_module_compression' from source: unknown 11728 1726882196.78039: variable 'ansible_shell_type' from source: unknown 11728 1726882196.78042: variable 'ansible_shell_executable' from source: unknown 11728 1726882196.78044: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882196.78047: variable 'ansible_pipelining' from source: unknown 11728 1726882196.78050: variable 'ansible_timeout' from source: unknown 11728 1726882196.78053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882196.78252: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11728 1726882196.78277: variable 'omit' from source: magic vars 11728 1726882196.78280: starting attempt loop 11728 1726882196.78282: running the handler 11728 1726882196.78288: _low_level_execute_command(): starting 11728 1726882196.78387: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882196.79030: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882196.79041: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882196.79115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882196.79118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882196.79121: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882196.79124: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882196.79126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882196.79128: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882196.79225: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882196.79230: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882196.79232: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882196.79246: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 11728 1726882196.79338: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882196.80926: stdout chunk (state=3): >>>/root <<< 11728 1726882196.81119: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882196.81123: stdout chunk (state=3): >>><<< 11728 1726882196.81133: stderr chunk (state=3): >>><<< 11728 1726882196.81302: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882196.81305: _low_level_execute_command(): starting 11728 1726882196.81309: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882196.8120437-12792-262738482834027 `" && echo ansible-tmp-1726882196.8120437-12792-262738482834027="` echo /root/.ansible/tmp/ansible-tmp-1726882196.8120437-12792-262738482834027 `" ) && sleep 0' 11728 1726882196.82418: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882196.82430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882196.82433: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882196.82435: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882196.82437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882196.82439: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882196.82646: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 11728 1726882196.82757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882196.84602: stdout chunk (state=3): >>>ansible-tmp-1726882196.8120437-12792-262738482834027=/root/.ansible/tmp/ansible-tmp-1726882196.8120437-12792-262738482834027 <<< 11728 1726882196.84826: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882196.85099: stderr chunk (state=3): >>><<< 11728 1726882196.85103: stdout chunk (state=3): >>><<< 11728 1726882196.85106: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882196.8120437-12792-262738482834027=/root/.ansible/tmp/ansible-tmp-1726882196.8120437-12792-262738482834027 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882196.85108: variable 'ansible_module_compression' from source: unknown 11728 1726882196.85110: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11728 1726882196.85112: variable 'ansible_facts' from source: unknown 11728 1726882196.85503: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882196.8120437-12792-262738482834027/AnsiballZ_stat.py 11728 1726882196.85695: Sending initial data 11728 1726882196.85777: Sent initial data (153 bytes) 11728 1726882196.86926: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882196.86939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882196.86952: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
11728 1726882196.87122: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882196.87134: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882196.87196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882196.88678: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11728 1726882196.88691: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882196.88727: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882196.88838: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882196.8120437-12792-262738482834027/AnsiballZ_stat.py" <<< 11728 1726882196.88841: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpbifgcmle /root/.ansible/tmp/ansible-tmp-1726882196.8120437-12792-262738482834027/AnsiballZ_stat.py <<< 11728 1726882196.88911: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpbifgcmle" to remote "/root/.ansible/tmp/ansible-tmp-1726882196.8120437-12792-262738482834027/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882196.8120437-12792-262738482834027/AnsiballZ_stat.py" <<< 11728 1726882196.90190: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882196.90268: stderr chunk (state=3): >>><<< 11728 1726882196.90278: stdout chunk (state=3): >>><<< 11728 1726882196.90403: done transferring module to remote 11728 1726882196.90406: _low_level_execute_command(): starting 11728 1726882196.90480: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882196.8120437-12792-262738482834027/ /root/.ansible/tmp/ansible-tmp-1726882196.8120437-12792-262738482834027/AnsiballZ_stat.py && sleep 0' 11728 1726882196.91476: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882196.91489: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882196.91512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882196.91530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882196.91547: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882196.91611: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882196.91682: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882196.91694: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882196.91717: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882196.91780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882196.93558: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882196.93561: stdout chunk (state=3): >>><<< 11728 1726882196.93563: stderr chunk (state=3): >>><<< 11728 1726882196.93783: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882196.93786: _low_level_execute_command(): starting 11728 1726882196.93788: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882196.8120437-12792-262738482834027/AnsiballZ_stat.py && sleep 0' 11728 1726882196.94399: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882196.94406: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882196.94409: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882196.94479: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882197.09373: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11728 1726882197.10598: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882197.10626: stderr chunk (state=3): >>><<< 11728 1726882197.10629: stdout chunk (state=3): >>><<< 11728 1726882197.10650: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
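The module invocation echoed in the stdout above shows exactly what the 'Stat profile file' task asked for: a bare existence check (get_attributes, get_checksum and get_mime all false) on /etc/sysconfig/network-scripts/ifcfg-bond0.1. Expressed as a task, with the path templated and the result registered for the later flag-setting steps (both assumptions; only the literal module arguments are visible in the log), that is roughly:

- name: Stat profile file
  ansible.builtin.stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"   # observed as ifcfg-bond0.1 in this run
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat   # assumed register name, based on the variable referenced later in the log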
11728 1726882197.10675: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882196.8120437-12792-262738482834027/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882197.10682: _low_level_execute_command(): starting 11728 1726882197.10687: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882196.8120437-12792-262738482834027/ > /dev/null 2>&1 && sleep 0' 11728 1726882197.11163: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882197.11167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882197.11169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882197.11171: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882197.11177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882197.11228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882197.11231: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882197.11283: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882197.13074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882197.13087: stderr chunk (state=3): >>><<< 11728 1726882197.13122: stdout chunk (state=3): >>><<< 11728 1726882197.13130: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882197.13146: handler run complete 11728 1726882197.13168: attempt loop complete, returning result 11728 1726882197.13171: _execute() done 11728 1726882197.13173: dumping result to json 11728 1726882197.13175: done dumping result, returning 11728 1726882197.13184: done running TaskExecutor() for managed_node3/TASK: Stat profile file [12673a56-9f93-5c28-a762-0000000005b5] 11728 1726882197.13186: sending task result for task 12673a56-9f93-5c28-a762-0000000005b5 11728 1726882197.13307: done sending task result for task 12673a56-9f93-5c28-a762-0000000005b5 11728 1726882197.13313: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 11728 1726882197.13404: no more pending results, returning what we have 11728 1726882197.13408: results queue empty 11728 1726882197.13409: checking for any_errors_fatal 11728 1726882197.13424: done checking for any_errors_fatal 11728 1726882197.13425: checking for max_fail_percentage 11728 1726882197.13426: done checking for max_fail_percentage 11728 1726882197.13427: checking to see if all hosts have failed and the running result is not ok 11728 1726882197.13428: done checking to see if all hosts have failed 11728 1726882197.13429: getting the remaining hosts for this loop 11728 1726882197.13430: done getting the remaining hosts for this loop 11728 1726882197.13434: getting the next task for host managed_node3 11728 1726882197.13442: done getting next task for host managed_node3 11728 1726882197.13445: ^ task is: TASK: Set NM profile exist flag based on the profile files 11728 1726882197.13452: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11728 1726882197.13455: getting variables 11728 1726882197.13456: in VariableManager get_vars() 11728 1726882197.13491: Calling all_inventory to load vars for managed_node3 11728 1726882197.13496: Calling groups_inventory to load vars for managed_node3 11728 1726882197.13499: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882197.13509: Calling all_plugins_play to load vars for managed_node3 11728 1726882197.13512: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882197.13515: Calling groups_plugins_play to load vars for managed_node3 11728 1726882197.14640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882197.15915: done with get_vars() 11728 1726882197.15941: done getting variables 11728 1726882197.16045: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:29:57 -0400 (0:00:00.399) 0:00:22.013 ****** 11728 1726882197.16086: entering _queue_task() for managed_node3/set_fact 11728 1726882197.16384: worker is 1 (out of 1 available) 11728 1726882197.16402: exiting _queue_task() for managed_node3/set_fact 11728 1726882197.16414: done queuing things up, now waiting for results queue to drain 11728 1726882197.16418: waiting for pending results... 
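The task just queued at get_profile_stat.yml:17 only has work to do when the profile file exists; since the stat above returned exists: false, it is about to be skipped. A minimal sketch of such a flag-setting task, assuming it keys off the registered stat result (the exact expression is not shown at this point in the log), is:

- name: Set NM profile exist flag based on the profile files
  ansible.builtin.set_fact:
    lsr_net_profile_exists: true   # assumed value; the log only shows this flag initialized to false earlier
  when: profile_stat.stat.exists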
11728 1726882197.16599: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 11728 1726882197.16681: in run() - task 12673a56-9f93-5c28-a762-0000000005b6 11728 1726882197.16697: variable 'ansible_search_path' from source: unknown 11728 1726882197.16701: variable 'ansible_search_path' from source: unknown 11728 1726882197.16728: calling self._execute() 11728 1726882197.16804: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882197.16807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882197.16817: variable 'omit' from source: magic vars 11728 1726882197.17085: variable 'ansible_distribution_major_version' from source: facts 11728 1726882197.17105: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882197.17198: variable 'profile_stat' from source: set_fact 11728 1726882197.17209: Evaluated conditional (profile_stat.stat.exists): False 11728 1726882197.17212: when evaluation is False, skipping this task 11728 1726882197.17215: _execute() done 11728 1726882197.17217: dumping result to json 11728 1726882197.17221: done dumping result, returning 11728 1726882197.17228: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [12673a56-9f93-5c28-a762-0000000005b6] 11728 1726882197.17233: sending task result for task 12673a56-9f93-5c28-a762-0000000005b6 11728 1726882197.17318: done sending task result for task 12673a56-9f93-5c28-a762-0000000005b6 11728 1726882197.17321: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11728 1726882197.17367: no more pending results, returning what we have 11728 1726882197.17371: results queue empty 11728 1726882197.17372: checking for any_errors_fatal 11728 1726882197.17381: done checking for any_errors_fatal 11728 1726882197.17381: checking for max_fail_percentage 11728 1726882197.17383: done checking for max_fail_percentage 11728 1726882197.17384: checking to see if all hosts have failed and the running result is not ok 11728 1726882197.17385: done checking to see if all hosts have failed 11728 1726882197.17385: getting the remaining hosts for this loop 11728 1726882197.17387: done getting the remaining hosts for this loop 11728 1726882197.17391: getting the next task for host managed_node3 11728 1726882197.17400: done getting next task for host managed_node3 11728 1726882197.17403: ^ task is: TASK: Get NM profile info 11728 1726882197.17408: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882197.17412: getting variables 11728 1726882197.17413: in VariableManager get_vars() 11728 1726882197.17447: Calling all_inventory to load vars for managed_node3 11728 1726882197.17450: Calling groups_inventory to load vars for managed_node3 11728 1726882197.17453: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882197.17465: Calling all_plugins_play to load vars for managed_node3 11728 1726882197.17467: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882197.17470: Calling groups_plugins_play to load vars for managed_node3 11728 1726882197.26578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882197.29405: done with get_vars() 11728 1726882197.29431: done getting variables 11728 1726882197.29480: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:29:57 -0400 (0:00:00.134) 0:00:22.147 ****** 11728 1726882197.29514: entering _queue_task() for managed_node3/shell 11728 1726882197.29844: worker is 1 (out of 1 available) 11728 1726882197.29857: exiting _queue_task() for managed_node3/shell 11728 1726882197.29869: done queuing things up, now waiting for results queue to drain 11728 1726882197.29870: waiting for pending results... 
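Editor's note: the "Get NM profile info" task queued above runs through the shell action plugin, and the command it executes is visible further down in the module invocation (nmcli -f NAME,FILENAME connection show, filtered on the profile name). A minimal sketch of such a task, assuming it registers the nm_profile_exists variable that the later conditional refers to:

    - name: Get NM profile info
      shell: nmcli -f NAME,FILENAME connection show | grep {{ bond_port_profile }} | grep /etc
      register: nm_profile_exists

Here bond_port_profile resolves to bond0.1, which matches the concrete command string shown in the result below.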
11728 1726882197.30312: running TaskExecutor() for managed_node3/TASK: Get NM profile info 11728 1726882197.30317: in run() - task 12673a56-9f93-5c28-a762-0000000005b7 11728 1726882197.30319: variable 'ansible_search_path' from source: unknown 11728 1726882197.30321: variable 'ansible_search_path' from source: unknown 11728 1726882197.30324: calling self._execute() 11728 1726882197.30391: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882197.30410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882197.30427: variable 'omit' from source: magic vars 11728 1726882197.30787: variable 'ansible_distribution_major_version' from source: facts 11728 1726882197.30807: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882197.30818: variable 'omit' from source: magic vars 11728 1726882197.30886: variable 'omit' from source: magic vars 11728 1726882197.30989: variable 'profile' from source: include params 11728 1726882197.31002: variable 'bond_port_profile' from source: include params 11728 1726882197.31071: variable 'bond_port_profile' from source: include params 11728 1726882197.31096: variable 'omit' from source: magic vars 11728 1726882197.31141: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882197.31187: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882197.31213: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882197.31235: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882197.31251: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882197.31290: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882197.31301: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882197.31311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882197.31412: Set connection var ansible_connection to ssh 11728 1726882197.31428: Set connection var ansible_shell_executable to /bin/sh 11728 1726882197.31440: Set connection var ansible_timeout to 10 11728 1726882197.31447: Set connection var ansible_shell_type to sh 11728 1726882197.31459: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882197.31469: Set connection var ansible_pipelining to False 11728 1726882197.31502: variable 'ansible_shell_executable' from source: unknown 11728 1726882197.31510: variable 'ansible_connection' from source: unknown 11728 1726882197.31518: variable 'ansible_module_compression' from source: unknown 11728 1726882197.31900: variable 'ansible_shell_type' from source: unknown 11728 1726882197.31904: variable 'ansible_shell_executable' from source: unknown 11728 1726882197.31906: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882197.31908: variable 'ansible_pipelining' from source: unknown 11728 1726882197.31911: variable 'ansible_timeout' from source: unknown 11728 1726882197.31912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882197.32038: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882197.32055: variable 'omit' from source: magic vars 11728 1726882197.32065: starting attempt loop 11728 1726882197.32072: running the handler 11728 1726882197.32086: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882197.32137: _low_level_execute_command(): starting 11728 1726882197.32226: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882197.33125: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882197.33215: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882197.33251: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882197.33268: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882197.33295: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882197.33454: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882197.35120: stdout chunk (state=3): >>>/root <<< 11728 1726882197.35272: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882197.35278: stdout chunk (state=3): >>><<< 11728 1726882197.35280: stderr chunk (state=3): >>><<< 11728 1726882197.35513: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882197.35517: _low_level_execute_command(): starting 11728 1726882197.35520: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882197.354199-12818-40308954165754 `" && echo ansible-tmp-1726882197.354199-12818-40308954165754="` echo /root/.ansible/tmp/ansible-tmp-1726882197.354199-12818-40308954165754 `" ) && sleep 0' 11728 1726882197.36283: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882197.36332: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882197.36354: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882197.36371: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882197.36458: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882197.38491: stdout chunk (state=3): >>>ansible-tmp-1726882197.354199-12818-40308954165754=/root/.ansible/tmp/ansible-tmp-1726882197.354199-12818-40308954165754 <<< 11728 1726882197.38498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882197.38500: stdout chunk (state=3): >>><<< 11728 1726882197.38503: stderr chunk (state=3): >>><<< 11728 1726882197.38745: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882197.354199-12818-40308954165754=/root/.ansible/tmp/ansible-tmp-1726882197.354199-12818-40308954165754 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882197.38749: variable 'ansible_module_compression' from source: unknown 11728 1726882197.38751: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882197.38764: variable 'ansible_facts' from source: unknown 11728 1726882197.38856: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882197.354199-12818-40308954165754/AnsiballZ_command.py 11728 1726882197.39197: Sending initial data 11728 1726882197.39208: Sent initial data (154 bytes) 11728 1726882197.39641: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882197.39656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11728 1726882197.39668: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882197.39733: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882197.39796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882197.41331: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882197.41371: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882197.41426: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmp6jaycy1b /root/.ansible/tmp/ansible-tmp-1726882197.354199-12818-40308954165754/AnsiballZ_command.py <<< 11728 1726882197.41430: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882197.354199-12818-40308954165754/AnsiballZ_command.py" <<< 11728 1726882197.41462: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmp6jaycy1b" to remote "/root/.ansible/tmp/ansible-tmp-1726882197.354199-12818-40308954165754/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882197.354199-12818-40308954165754/AnsiballZ_command.py" <<< 11728 1726882197.42812: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882197.42816: stdout chunk (state=3): >>><<< 11728 1726882197.42818: stderr chunk (state=3): >>><<< 11728 1726882197.42822: done transferring module to remote 11728 1726882197.42824: _low_level_execute_command(): starting 11728 1726882197.42827: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882197.354199-12818-40308954165754/ /root/.ansible/tmp/ansible-tmp-1726882197.354199-12818-40308954165754/AnsiballZ_command.py && sleep 0' 11728 1726882197.43386: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882197.43405: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882197.43421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882197.43438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882197.43456: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882197.43514: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882197.43562: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882197.43592: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882197.43660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882197.45623: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882197.45627: stdout chunk (state=3): >>><<< 11728 1726882197.45629: stderr chunk (state=3): >>><<< 11728 1726882197.45632: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882197.45634: _low_level_execute_command(): starting 11728 1726882197.45637: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882197.354199-12818-40308954165754/AnsiballZ_command.py && sleep 0' 11728 1726882197.46757: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882197.46760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882197.46763: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882197.46766: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882197.46768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882197.46989: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882197.46992: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882197.47010: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882197.47191: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882197.64142: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-20 21:29:57.619071", "end": "2024-09-20 21:29:57.639776", "delta": "0:00:00.020705", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882197.65718: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882197.65751: stderr chunk (state=3): >>>Shared connection to 10.31.10.229 closed. <<< 11728 1726882197.65755: stdout chunk (state=3): >>><<< 11728 1726882197.65757: stderr chunk (state=3): >>><<< 11728 1726882197.65775: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-20 21:29:57.619071", "end": "2024-09-20 21:29:57.639776", "delta": "0:00:00.020705", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
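Editor's note: the exchange above is the standard non-pipelined module lifecycle: reuse the multiplexed SSH master, create a remote temp directory, sftp AnsiballZ_command.py across, chmod it, run it with /usr/bin/python3.12, and (below) remove the temp directory. The transfer happens because the log records "Set connection var ansible_pipelining to False". A minimal sketch of enabling pipelining for these hosts, assuming a group_vars file that is not part of this test run:

    # group_vars/all.yml (hypothetical file; pipelining is explicitly False in the run logged here)
    ansible_pipelining: true

With pipelining on, the module payload is streamed over the existing SSH channel's stdin instead of being staged in a temp directory; it defaults to off because it is incompatible with requiretty in sudoers when sudo-based privilege escalation is used.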
11728 1726882197.65900: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882197.354199-12818-40308954165754/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882197.65904: _low_level_execute_command(): starting 11728 1726882197.65907: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882197.354199-12818-40308954165754/ > /dev/null 2>&1 && sleep 0' 11728 1726882197.66443: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882197.66460: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882197.66472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882197.66489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882197.66509: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882197.66519: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882197.66530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882197.66549: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882197.66559: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882197.66568: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11728 1726882197.66609: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882197.66657: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882197.66671: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882197.66697: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882197.66766: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882197.68612: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882197.68629: stderr chunk (state=3): >>><<< 11728 1726882197.68635: stdout chunk (state=3): >>><<< 11728 1726882197.68665: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882197.68672: handler run complete 11728 1726882197.68699: Evaluated conditional (False): False 11728 1726882197.68712: attempt loop complete, returning result 11728 1726882197.68715: _execute() done 11728 1726882197.68718: dumping result to json 11728 1726882197.68723: done dumping result, returning 11728 1726882197.68732: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [12673a56-9f93-5c28-a762-0000000005b7] 11728 1726882197.68737: sending task result for task 12673a56-9f93-5c28-a762-0000000005b7 11728 1726882197.68989: done sending task result for task 12673a56-9f93-5c28-a762-0000000005b7 11728 1726882197.68992: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "delta": "0:00:00.020705", "end": "2024-09-20 21:29:57.639776", "rc": 0, "start": "2024-09-20 21:29:57.619071" } STDOUT: bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 11728 1726882197.69081: no more pending results, returning what we have 11728 1726882197.69085: results queue empty 11728 1726882197.69086: checking for any_errors_fatal 11728 1726882197.69097: done checking for any_errors_fatal 11728 1726882197.69098: checking for max_fail_percentage 11728 1726882197.69100: done checking for max_fail_percentage 11728 1726882197.69101: checking to see if all hosts have failed and the running result is not ok 11728 1726882197.69102: done checking to see if all hosts have failed 11728 1726882197.69103: getting the remaining hosts for this loop 11728 1726882197.69105: done getting the remaining hosts for this loop 11728 1726882197.69109: getting the next task for host managed_node3 11728 1726882197.69118: done getting next task for host managed_node3 11728 1726882197.69121: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11728 1726882197.69128: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882197.69132: getting variables 11728 1726882197.69134: in VariableManager get_vars() 11728 1726882197.69168: Calling all_inventory to load vars for managed_node3 11728 1726882197.69171: Calling groups_inventory to load vars for managed_node3 11728 1726882197.69175: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882197.69186: Calling all_plugins_play to load vars for managed_node3 11728 1726882197.69189: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882197.69192: Calling groups_plugins_play to load vars for managed_node3 11728 1726882197.70776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882197.72853: done with get_vars() 11728 1726882197.72874: done getting variables 11728 1726882197.72943: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:29:57 -0400 (0:00:00.434) 0:00:22.582 ****** 11728 1726882197.72981: entering _queue_task() for managed_node3/set_fact 11728 1726882197.73348: worker is 1 (out of 1 available) 11728 1726882197.73359: exiting _queue_task() for managed_node3/set_fact 11728 1726882197.73489: done queuing things up, now waiting for results queue to drain 11728 1726882197.73492: waiting for pending results... 
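Editor's note: the nmcli output above (bond0.1 mapped to /etc/NetworkManager/system-connections/bond0.1.nmconnection, rc=0) feeds the next task, which the log shows evaluating nm_profile_exists.rc == 0 and then setting lsr_net_profile_exists, lsr_net_profile_ansible_managed and lsr_net_profile_fingerprint. A minimal sketch of that flag-setting task, reconstructed from the conditional and the facts reported below:

    - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
      set_fact:
        lsr_net_profile_exists: true
        lsr_net_profile_ansible_managed: true
        lsr_net_profile_fingerprint: true
      when: nm_profile_exists.rc == 0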
11728 1726882197.73705: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11728 1726882197.73859: in run() - task 12673a56-9f93-5c28-a762-0000000005b8 11728 1726882197.73920: variable 'ansible_search_path' from source: unknown 11728 1726882197.73924: variable 'ansible_search_path' from source: unknown 11728 1726882197.73930: calling self._execute() 11728 1726882197.74021: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882197.74025: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882197.74153: variable 'omit' from source: magic vars 11728 1726882197.74471: variable 'ansible_distribution_major_version' from source: facts 11728 1726882197.74497: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882197.74640: variable 'nm_profile_exists' from source: set_fact 11728 1726882197.74655: Evaluated conditional (nm_profile_exists.rc == 0): True 11728 1726882197.74659: variable 'omit' from source: magic vars 11728 1726882197.74730: variable 'omit' from source: magic vars 11728 1726882197.74762: variable 'omit' from source: magic vars 11728 1726882197.74812: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882197.74850: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882197.74917: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882197.74921: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882197.74923: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882197.74938: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882197.74942: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882197.74944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882197.75054: Set connection var ansible_connection to ssh 11728 1726882197.75065: Set connection var ansible_shell_executable to /bin/sh 11728 1726882197.75071: Set connection var ansible_timeout to 10 11728 1726882197.75073: Set connection var ansible_shell_type to sh 11728 1726882197.75081: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882197.75098: Set connection var ansible_pipelining to False 11728 1726882197.75114: variable 'ansible_shell_executable' from source: unknown 11728 1726882197.75117: variable 'ansible_connection' from source: unknown 11728 1726882197.75120: variable 'ansible_module_compression' from source: unknown 11728 1726882197.75122: variable 'ansible_shell_type' from source: unknown 11728 1726882197.75241: variable 'ansible_shell_executable' from source: unknown 11728 1726882197.75244: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882197.75246: variable 'ansible_pipelining' from source: unknown 11728 1726882197.75248: variable 'ansible_timeout' from source: unknown 11728 1726882197.75251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882197.75299: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882197.75312: variable 'omit' from source: magic vars 11728 1726882197.75318: starting attempt loop 11728 1726882197.75321: running the handler 11728 1726882197.75334: handler run complete 11728 1726882197.75353: attempt loop complete, returning result 11728 1726882197.75356: _execute() done 11728 1726882197.75359: dumping result to json 11728 1726882197.75361: done dumping result, returning 11728 1726882197.75371: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12673a56-9f93-5c28-a762-0000000005b8] 11728 1726882197.75376: sending task result for task 12673a56-9f93-5c28-a762-0000000005b8 11728 1726882197.75578: done sending task result for task 12673a56-9f93-5c28-a762-0000000005b8 11728 1726882197.75582: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 11728 1726882197.75639: no more pending results, returning what we have 11728 1726882197.75642: results queue empty 11728 1726882197.75643: checking for any_errors_fatal 11728 1726882197.75650: done checking for any_errors_fatal 11728 1726882197.75651: checking for max_fail_percentage 11728 1726882197.75653: done checking for max_fail_percentage 11728 1726882197.75654: checking to see if all hosts have failed and the running result is not ok 11728 1726882197.75655: done checking to see if all hosts have failed 11728 1726882197.75655: getting the remaining hosts for this loop 11728 1726882197.75657: done getting the remaining hosts for this loop 11728 1726882197.75661: getting the next task for host managed_node3 11728 1726882197.75827: done getting next task for host managed_node3 11728 1726882197.75831: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 11728 1726882197.75837: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882197.75840: getting variables 11728 1726882197.75841: in VariableManager get_vars() 11728 1726882197.75866: Calling all_inventory to load vars for managed_node3 11728 1726882197.75868: Calling groups_inventory to load vars for managed_node3 11728 1726882197.75871: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882197.75879: Calling all_plugins_play to load vars for managed_node3 11728 1726882197.75881: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882197.75884: Calling groups_plugins_play to load vars for managed_node3 11728 1726882197.77179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882197.78805: done with get_vars() 11728 1726882197.78834: done getting variables 11728 1726882197.78892: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11728 1726882197.79032: variable 'profile' from source: include params 11728 1726882197.79036: variable 'bond_port_profile' from source: include params 11728 1726882197.79102: variable 'bond_port_profile' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.1] ************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:29:57 -0400 (0:00:00.061) 0:00:22.643 ****** 11728 1726882197.79141: entering _queue_task() for managed_node3/command 11728 1726882197.79706: worker is 1 (out of 1 available) 11728 1726882197.79714: exiting _queue_task() for managed_node3/command 11728 1726882197.79724: done queuing things up, now waiting for results queue to drain 11728 1726882197.79725: waiting for pending results... 
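Editor's note: the tasks that follow look for the ansible_managed and fingerprint comments in ifcfg-bond0.1 and are skipped because profile_stat.stat.exists is false, presumably because this profile is stored as a NetworkManager keyfile rather than the initscripts file that was stat-ed. The exact command is never shown since the task does not run; an illustrative sketch under that assumption, with the path, grep pattern, and register name all hypothetical:

    - name: Get the ansible_managed comment in ifcfg-bond0.1
      command: grep -c ansible_managed /etc/sysconfig/network-scripts/ifcfg-bond0.1  # hypothetical path and pattern
      register: ifcfg_ansible_managed  # hypothetical register name
      when: profile_stat.stat.exists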
11728 1726882197.79782: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.1 11728 1726882197.79996: in run() - task 12673a56-9f93-5c28-a762-0000000005ba 11728 1726882197.80000: variable 'ansible_search_path' from source: unknown 11728 1726882197.80003: variable 'ansible_search_path' from source: unknown 11728 1726882197.80006: calling self._execute() 11728 1726882197.80078: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882197.80084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882197.80098: variable 'omit' from source: magic vars 11728 1726882197.80480: variable 'ansible_distribution_major_version' from source: facts 11728 1726882197.80491: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882197.80626: variable 'profile_stat' from source: set_fact 11728 1726882197.80642: Evaluated conditional (profile_stat.stat.exists): False 11728 1726882197.80645: when evaluation is False, skipping this task 11728 1726882197.80648: _execute() done 11728 1726882197.80650: dumping result to json 11728 1726882197.80653: done dumping result, returning 11728 1726882197.80655: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.1 [12673a56-9f93-5c28-a762-0000000005ba] 11728 1726882197.80699: sending task result for task 12673a56-9f93-5c28-a762-0000000005ba skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11728 1726882197.81024: no more pending results, returning what we have 11728 1726882197.81028: results queue empty 11728 1726882197.81029: checking for any_errors_fatal 11728 1726882197.81034: done checking for any_errors_fatal 11728 1726882197.81035: checking for max_fail_percentage 11728 1726882197.81037: done checking for max_fail_percentage 11728 1726882197.81038: checking to see if all hosts have failed and the running result is not ok 11728 1726882197.81038: done checking to see if all hosts have failed 11728 1726882197.81039: getting the remaining hosts for this loop 11728 1726882197.81040: done getting the remaining hosts for this loop 11728 1726882197.81048: getting the next task for host managed_node3 11728 1726882197.81054: done getting next task for host managed_node3 11728 1726882197.81056: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 11728 1726882197.81062: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882197.81065: getting variables 11728 1726882197.81066: in VariableManager get_vars() 11728 1726882197.81092: Calling all_inventory to load vars for managed_node3 11728 1726882197.81097: Calling groups_inventory to load vars for managed_node3 11728 1726882197.81100: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882197.81108: Calling all_plugins_play to load vars for managed_node3 11728 1726882197.81111: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882197.81113: Calling groups_plugins_play to load vars for managed_node3 11728 1726882197.81668: done sending task result for task 12673a56-9f93-5c28-a762-0000000005ba 11728 1726882197.81671: WORKER PROCESS EXITING 11728 1726882197.82620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882197.84217: done with get_vars() 11728 1726882197.84239: done getting variables 11728 1726882197.84304: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11728 1726882197.84430: variable 'profile' from source: include params 11728 1726882197.84434: variable 'bond_port_profile' from source: include params 11728 1726882197.84498: variable 'bond_port_profile' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.1] ********************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:29:57 -0400 (0:00:00.053) 0:00:22.697 ****** 11728 1726882197.84538: entering _queue_task() for managed_node3/set_fact 11728 1726882197.84974: worker is 1 (out of 1 available) 11728 1726882197.84985: exiting _queue_task() for managed_node3/set_fact 11728 1726882197.85000: done queuing things up, now waiting for results queue to drain 11728 1726882197.85001: waiting for pending results... 
11728 1726882197.85211: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 11728 1726882197.85413: in run() - task 12673a56-9f93-5c28-a762-0000000005bb 11728 1726882197.85418: variable 'ansible_search_path' from source: unknown 11728 1726882197.85420: variable 'ansible_search_path' from source: unknown 11728 1726882197.85454: calling self._execute() 11728 1726882197.85631: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882197.85634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882197.85637: variable 'omit' from source: magic vars 11728 1726882197.86001: variable 'ansible_distribution_major_version' from source: facts 11728 1726882197.86022: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882197.86156: variable 'profile_stat' from source: set_fact 11728 1726882197.86177: Evaluated conditional (profile_stat.stat.exists): False 11728 1726882197.86184: when evaluation is False, skipping this task 11728 1726882197.86189: _execute() done 11728 1726882197.86200: dumping result to json 11728 1726882197.86211: done dumping result, returning 11728 1726882197.86220: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 [12673a56-9f93-5c28-a762-0000000005bb] 11728 1726882197.86228: sending task result for task 12673a56-9f93-5c28-a762-0000000005bb 11728 1726882197.86523: done sending task result for task 12673a56-9f93-5c28-a762-0000000005bb 11728 1726882197.86526: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11728 1726882197.86565: no more pending results, returning what we have 11728 1726882197.86567: results queue empty 11728 1726882197.86568: checking for any_errors_fatal 11728 1726882197.86573: done checking for any_errors_fatal 11728 1726882197.86573: checking for max_fail_percentage 11728 1726882197.86575: done checking for max_fail_percentage 11728 1726882197.86576: checking to see if all hosts have failed and the running result is not ok 11728 1726882197.86576: done checking to see if all hosts have failed 11728 1726882197.86577: getting the remaining hosts for this loop 11728 1726882197.86578: done getting the remaining hosts for this loop 11728 1726882197.86581: getting the next task for host managed_node3 11728 1726882197.86587: done getting next task for host managed_node3 11728 1726882197.86589: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 11728 1726882197.86599: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882197.86602: getting variables 11728 1726882197.86603: in VariableManager get_vars() 11728 1726882197.86627: Calling all_inventory to load vars for managed_node3 11728 1726882197.86629: Calling groups_inventory to load vars for managed_node3 11728 1726882197.86631: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882197.86647: Calling all_plugins_play to load vars for managed_node3 11728 1726882197.86649: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882197.86652: Calling groups_plugins_play to load vars for managed_node3 11728 1726882197.88084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882197.89752: done with get_vars() 11728 1726882197.89773: done getting variables 11728 1726882197.89842: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11728 1726882197.89961: variable 'profile' from source: include params 11728 1726882197.89964: variable 'bond_port_profile' from source: include params 11728 1726882197.90032: variable 'bond_port_profile' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.1] **************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:29:57 -0400 (0:00:00.055) 0:00:22.753 ****** 11728 1726882197.90067: entering _queue_task() for managed_node3/command 11728 1726882197.90618: worker is 1 (out of 1 available) 11728 1726882197.90626: exiting _queue_task() for managed_node3/command 11728 1726882197.90636: done queuing things up, now waiting for results queue to drain 11728 1726882197.90637: waiting for pending results... 
11728 1726882197.90686: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.1 11728 1726882197.90835: in run() - task 12673a56-9f93-5c28-a762-0000000005bc 11728 1726882197.90972: variable 'ansible_search_path' from source: unknown 11728 1726882197.90979: variable 'ansible_search_path' from source: unknown 11728 1726882197.90982: calling self._execute() 11728 1726882197.91024: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882197.91038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882197.91087: variable 'omit' from source: magic vars 11728 1726882197.91451: variable 'ansible_distribution_major_version' from source: facts 11728 1726882197.91466: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882197.91587: variable 'profile_stat' from source: set_fact 11728 1726882197.91607: Evaluated conditional (profile_stat.stat.exists): False 11728 1726882197.91629: when evaluation is False, skipping this task 11728 1726882197.91631: _execute() done 11728 1726882197.91698: dumping result to json 11728 1726882197.91704: done dumping result, returning 11728 1726882197.91707: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.1 [12673a56-9f93-5c28-a762-0000000005bc] 11728 1726882197.91709: sending task result for task 12673a56-9f93-5c28-a762-0000000005bc 11728 1726882197.91775: done sending task result for task 12673a56-9f93-5c28-a762-0000000005bc 11728 1726882197.91778: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11728 1726882197.91854: no more pending results, returning what we have 11728 1726882197.91860: results queue empty 11728 1726882197.91861: checking for any_errors_fatal 11728 1726882197.91867: done checking for any_errors_fatal 11728 1726882197.91868: checking for max_fail_percentage 11728 1726882197.91869: done checking for max_fail_percentage 11728 1726882197.91870: checking to see if all hosts have failed and the running result is not ok 11728 1726882197.91871: done checking to see if all hosts have failed 11728 1726882197.91872: getting the remaining hosts for this loop 11728 1726882197.91873: done getting the remaining hosts for this loop 11728 1726882197.91876: getting the next task for host managed_node3 11728 1726882197.91884: done getting next task for host managed_node3 11728 1726882197.91886: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 11728 1726882197.91897: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882197.91901: getting variables 11728 1726882197.91903: in VariableManager get_vars() 11728 1726882197.91931: Calling all_inventory to load vars for managed_node3 11728 1726882197.91933: Calling groups_inventory to load vars for managed_node3 11728 1726882197.91936: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882197.91948: Calling all_plugins_play to load vars for managed_node3 11728 1726882197.91950: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882197.91953: Calling groups_plugins_play to load vars for managed_node3 11728 1726882197.93674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882197.95392: done with get_vars() 11728 1726882197.95419: done getting variables 11728 1726882197.95478: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11728 1726882197.95615: variable 'profile' from source: include params 11728 1726882197.95619: variable 'bond_port_profile' from source: include params 11728 1726882197.95679: variable 'bond_port_profile' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.1] ************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:29:57 -0400 (0:00:00.056) 0:00:22.809 ****** 11728 1726882197.95723: entering _queue_task() for managed_node3/set_fact 11728 1726882197.96115: worker is 1 (out of 1 available) 11728 1726882197.96127: exiting _queue_task() for managed_node3/set_fact 11728 1726882197.96254: done queuing things up, now waiting for results queue to drain 11728 1726882197.96256: waiting for pending results... 
11728 1726882197.96491: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.1 11728 1726882197.96562: in run() - task 12673a56-9f93-5c28-a762-0000000005bd 11728 1726882197.96596: variable 'ansible_search_path' from source: unknown 11728 1726882197.96690: variable 'ansible_search_path' from source: unknown 11728 1726882197.96704: calling self._execute() 11728 1726882197.96753: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882197.96764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882197.96778: variable 'omit' from source: magic vars 11728 1726882197.97163: variable 'ansible_distribution_major_version' from source: facts 11728 1726882197.97178: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882197.97304: variable 'profile_stat' from source: set_fact 11728 1726882197.97320: Evaluated conditional (profile_stat.stat.exists): False 11728 1726882197.97327: when evaluation is False, skipping this task 11728 1726882197.97333: _execute() done 11728 1726882197.97349: dumping result to json 11728 1726882197.97359: done dumping result, returning 11728 1726882197.97368: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.1 [12673a56-9f93-5c28-a762-0000000005bd] 11728 1726882197.97377: sending task result for task 12673a56-9f93-5c28-a762-0000000005bd 11728 1726882197.97592: done sending task result for task 12673a56-9f93-5c28-a762-0000000005bd 11728 1726882197.97599: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11728 1726882197.97645: no more pending results, returning what we have 11728 1726882197.97649: results queue empty 11728 1726882197.97650: checking for any_errors_fatal 11728 1726882197.97658: done checking for any_errors_fatal 11728 1726882197.97658: checking for max_fail_percentage 11728 1726882197.97660: done checking for max_fail_percentage 11728 1726882197.97666: checking to see if all hosts have failed and the running result is not ok 11728 1726882197.97666: done checking to see if all hosts have failed 11728 1726882197.97667: getting the remaining hosts for this loop 11728 1726882197.97669: done getting the remaining hosts for this loop 11728 1726882197.97672: getting the next task for host managed_node3 11728 1726882197.97682: done getting next task for host managed_node3 11728 1726882197.97685: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 11728 1726882197.97691: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882197.97877: getting variables 11728 1726882197.97879: in VariableManager get_vars() 11728 1726882197.97909: Calling all_inventory to load vars for managed_node3 11728 1726882197.97912: Calling groups_inventory to load vars for managed_node3 11728 1726882197.97915: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882197.97924: Calling all_plugins_play to load vars for managed_node3 11728 1726882197.97927: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882197.97930: Calling groups_plugins_play to load vars for managed_node3 11728 1726882197.99269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882198.01065: done with get_vars() 11728 1726882198.01088: done getting variables 11728 1726882198.01155: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11728 1726882198.01278: variable 'profile' from source: include params 11728 1726882198.01282: variable 'bond_port_profile' from source: include params 11728 1726882198.01344: variable 'bond_port_profile' from source: include params TASK [Assert that the profile is present - 'bond0.1'] ************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:29:58 -0400 (0:00:00.056) 0:00:22.866 ****** 11728 1726882198.01374: entering _queue_task() for managed_node3/assert 11728 1726882198.01918: worker is 1 (out of 1 available) 11728 1726882198.01926: exiting _queue_task() for managed_node3/assert 11728 1726882198.01935: done queuing things up, now waiting for results queue to drain 11728 1726882198.01937: waiting for pending results... 
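The three traces below execute the asserts from assert_profile_present.yml (task paths :5, :10 and :15). Each one checks a single fact set earlier in the run -- lsr_net_profile_exists, lsr_net_profile_ansible_managed and lsr_net_profile_fingerprint, exactly as reported in the "Evaluated conditional" entries -- and returns "All assertions passed". A sketch of that style of task; the task names and fact names are copied from the log, everything else is illustrative:

# Sketch of the assert tasks traced below; not copied from the actual playbook.
- name: Assert that the profile is present - '{{ profile }}'
  ansible.builtin.assert:
    that:
      - lsr_net_profile_exists

- name: Assert that the ansible managed comment is present in '{{ profile }}'
  ansible.builtin.assert:
    that:
      - lsr_net_profile_ansible_managed

- name: Assert that the fingerprint comment is present in {{ profile }}
  ansible.builtin.assert:
    that:
      - lsr_net_profile_fingerprint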
11728 1726882198.02070: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.1' 11728 1726882198.02141: in run() - task 12673a56-9f93-5c28-a762-0000000004e8 11728 1726882198.02175: variable 'ansible_search_path' from source: unknown 11728 1726882198.02183: variable 'ansible_search_path' from source: unknown 11728 1726882198.02227: calling self._execute() 11728 1726882198.02385: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882198.02389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882198.02392: variable 'omit' from source: magic vars 11728 1726882198.02855: variable 'ansible_distribution_major_version' from source: facts 11728 1726882198.02874: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882198.03000: variable 'omit' from source: magic vars 11728 1726882198.03003: variable 'omit' from source: magic vars 11728 1726882198.03158: variable 'profile' from source: include params 11728 1726882198.03168: variable 'bond_port_profile' from source: include params 11728 1726882198.03241: variable 'bond_port_profile' from source: include params 11728 1726882198.03276: variable 'omit' from source: magic vars 11728 1726882198.03322: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882198.03371: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882198.03402: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882198.03426: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882198.03443: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882198.03587: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882198.03590: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882198.03596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882198.03615: Set connection var ansible_connection to ssh 11728 1726882198.03632: Set connection var ansible_shell_executable to /bin/sh 11728 1726882198.03642: Set connection var ansible_timeout to 10 11728 1726882198.03649: Set connection var ansible_shell_type to sh 11728 1726882198.03660: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882198.03670: Set connection var ansible_pipelining to False 11728 1726882198.03711: variable 'ansible_shell_executable' from source: unknown 11728 1726882198.03720: variable 'ansible_connection' from source: unknown 11728 1726882198.03726: variable 'ansible_module_compression' from source: unknown 11728 1726882198.03732: variable 'ansible_shell_type' from source: unknown 11728 1726882198.03739: variable 'ansible_shell_executable' from source: unknown 11728 1726882198.03745: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882198.03752: variable 'ansible_pipelining' from source: unknown 11728 1726882198.03759: variable 'ansible_timeout' from source: unknown 11728 1726882198.03767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882198.03935: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882198.03953: variable 'omit' from source: magic vars 11728 1726882198.03963: starting attempt loop 11728 1726882198.03969: running the handler 11728 1726882198.04097: variable 'lsr_net_profile_exists' from source: set_fact 11728 1726882198.04132: Evaluated conditional (lsr_net_profile_exists): True 11728 1726882198.04135: handler run complete 11728 1726882198.04240: attempt loop complete, returning result 11728 1726882198.04244: _execute() done 11728 1726882198.04247: dumping result to json 11728 1726882198.04250: done dumping result, returning 11728 1726882198.04252: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.1' [12673a56-9f93-5c28-a762-0000000004e8] 11728 1726882198.04254: sending task result for task 12673a56-9f93-5c28-a762-0000000004e8 11728 1726882198.04323: done sending task result for task 12673a56-9f93-5c28-a762-0000000004e8 11728 1726882198.04326: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11728 1726882198.04391: no more pending results, returning what we have 11728 1726882198.04399: results queue empty 11728 1726882198.04401: checking for any_errors_fatal 11728 1726882198.04407: done checking for any_errors_fatal 11728 1726882198.04408: checking for max_fail_percentage 11728 1726882198.04411: done checking for max_fail_percentage 11728 1726882198.04412: checking to see if all hosts have failed and the running result is not ok 11728 1726882198.04413: done checking to see if all hosts have failed 11728 1726882198.04413: getting the remaining hosts for this loop 11728 1726882198.04415: done getting the remaining hosts for this loop 11728 1726882198.04419: getting the next task for host managed_node3 11728 1726882198.04428: done getting next task for host managed_node3 11728 1726882198.04431: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 11728 1726882198.04436: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882198.04440: getting variables 11728 1726882198.04441: in VariableManager get_vars() 11728 1726882198.04477: Calling all_inventory to load vars for managed_node3 11728 1726882198.04481: Calling groups_inventory to load vars for managed_node3 11728 1726882198.04485: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882198.04601: Calling all_plugins_play to load vars for managed_node3 11728 1726882198.04612: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882198.04617: Calling groups_plugins_play to load vars for managed_node3 11728 1726882198.06205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882198.07878: done with get_vars() 11728 1726882198.07902: done getting variables 11728 1726882198.07974: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11728 1726882198.08107: variable 'profile' from source: include params 11728 1726882198.08110: variable 'bond_port_profile' from source: include params 11728 1726882198.08177: variable 'bond_port_profile' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.1'] ********* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:29:58 -0400 (0:00:00.068) 0:00:22.934 ****** 11728 1726882198.08214: entering _queue_task() for managed_node3/assert 11728 1726882198.08639: worker is 1 (out of 1 available) 11728 1726882198.08650: exiting _queue_task() for managed_node3/assert 11728 1726882198.08661: done queuing things up, now waiting for results queue to drain 11728 1726882198.08663: waiting for pending results... 
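Every task trace in this run repeats the same "Set connection var ..." block before loading its action plugin. Written out as host or group variables, those effective settings correspond to roughly the following (values read straight from the log; in this run they come from built-in defaults and host vars rather than from a vars file):

# Variable form of the connection settings the executor reports; shown for
# orientation only, not taken from the actual inventory.
ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_timeout: 10
ansible_pipelining: false
ansible_module_compression: ZIP_DEFLATED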
11728 1726882198.08906: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.1' 11728 1726882198.09041: in run() - task 12673a56-9f93-5c28-a762-0000000004e9 11728 1726882198.09045: variable 'ansible_search_path' from source: unknown 11728 1726882198.09048: variable 'ansible_search_path' from source: unknown 11728 1726882198.09087: calling self._execute() 11728 1726882198.09192: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882198.09218: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882198.09222: variable 'omit' from source: magic vars 11728 1726882198.09609: variable 'ansible_distribution_major_version' from source: facts 11728 1726882198.09651: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882198.09655: variable 'omit' from source: magic vars 11728 1726882198.09711: variable 'omit' from source: magic vars 11728 1726882198.09820: variable 'profile' from source: include params 11728 1726882198.09830: variable 'bond_port_profile' from source: include params 11728 1726882198.09911: variable 'bond_port_profile' from source: include params 11728 1726882198.09925: variable 'omit' from source: magic vars 11728 1726882198.09967: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882198.10087: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882198.10091: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882198.10096: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882198.10099: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882198.10117: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882198.10132: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882198.10139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882198.10248: Set connection var ansible_connection to ssh 11728 1726882198.10263: Set connection var ansible_shell_executable to /bin/sh 11728 1726882198.10272: Set connection var ansible_timeout to 10 11728 1726882198.10279: Set connection var ansible_shell_type to sh 11728 1726882198.10291: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882198.10310: Set connection var ansible_pipelining to False 11728 1726882198.10343: variable 'ansible_shell_executable' from source: unknown 11728 1726882198.10400: variable 'ansible_connection' from source: unknown 11728 1726882198.10403: variable 'ansible_module_compression' from source: unknown 11728 1726882198.10406: variable 'ansible_shell_type' from source: unknown 11728 1726882198.10408: variable 'ansible_shell_executable' from source: unknown 11728 1726882198.10410: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882198.10412: variable 'ansible_pipelining' from source: unknown 11728 1726882198.10415: variable 'ansible_timeout' from source: unknown 11728 1726882198.10417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882198.10563: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882198.10568: variable 'omit' from source: magic vars 11728 1726882198.10599: starting attempt loop 11728 1726882198.10602: running the handler 11728 1726882198.10708: variable 'lsr_net_profile_ansible_managed' from source: set_fact 11728 1726882198.10718: Evaluated conditional (lsr_net_profile_ansible_managed): True 11728 1726882198.10727: handler run complete 11728 1726882198.10779: attempt loop complete, returning result 11728 1726882198.10782: _execute() done 11728 1726882198.10784: dumping result to json 11728 1726882198.10787: done dumping result, returning 11728 1726882198.10789: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.1' [12673a56-9f93-5c28-a762-0000000004e9] 11728 1726882198.10791: sending task result for task 12673a56-9f93-5c28-a762-0000000004e9 11728 1726882198.11131: done sending task result for task 12673a56-9f93-5c28-a762-0000000004e9 11728 1726882198.11134: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11728 1726882198.11183: no more pending results, returning what we have 11728 1726882198.11187: results queue empty 11728 1726882198.11187: checking for any_errors_fatal 11728 1726882198.11197: done checking for any_errors_fatal 11728 1726882198.11198: checking for max_fail_percentage 11728 1726882198.11200: done checking for max_fail_percentage 11728 1726882198.11201: checking to see if all hosts have failed and the running result is not ok 11728 1726882198.11202: done checking to see if all hosts have failed 11728 1726882198.11203: getting the remaining hosts for this loop 11728 1726882198.11204: done getting the remaining hosts for this loop 11728 1726882198.11207: getting the next task for host managed_node3 11728 1726882198.11214: done getting next task for host managed_node3 11728 1726882198.11216: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 11728 1726882198.11221: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882198.11225: getting variables 11728 1726882198.11226: in VariableManager get_vars() 11728 1726882198.11260: Calling all_inventory to load vars for managed_node3 11728 1726882198.11263: Calling groups_inventory to load vars for managed_node3 11728 1726882198.11267: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882198.11276: Calling all_plugins_play to load vars for managed_node3 11728 1726882198.11279: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882198.11282: Calling groups_plugins_play to load vars for managed_node3 11728 1726882198.12989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882198.14641: done with get_vars() 11728 1726882198.14666: done getting variables 11728 1726882198.14739: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11728 1726882198.14869: variable 'profile' from source: include params 11728 1726882198.14872: variable 'bond_port_profile' from source: include params 11728 1726882198.14935: variable 'bond_port_profile' from source: include params TASK [Assert that the fingerprint comment is present in bond0.1] *************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:29:58 -0400 (0:00:00.067) 0:00:23.002 ****** 11728 1726882198.14974: entering _queue_task() for managed_node3/assert 11728 1726882198.15340: worker is 1 (out of 1 available) 11728 1726882198.15354: exiting _queue_task() for managed_node3/assert 11728 1726882198.15364: done queuing things up, now waiting for results queue to drain 11728 1726882198.15366: waiting for pending results... 
11728 1726882198.15654: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.1 11728 1726882198.15790: in run() - task 12673a56-9f93-5c28-a762-0000000004ea 11728 1726882198.15824: variable 'ansible_search_path' from source: unknown 11728 1726882198.15831: variable 'ansible_search_path' from source: unknown 11728 1726882198.15866: calling self._execute() 11728 1726882198.16000: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882198.16005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882198.16007: variable 'omit' from source: magic vars 11728 1726882198.16372: variable 'ansible_distribution_major_version' from source: facts 11728 1726882198.16387: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882198.16400: variable 'omit' from source: magic vars 11728 1726882198.16447: variable 'omit' from source: magic vars 11728 1726882198.16556: variable 'profile' from source: include params 11728 1726882198.16583: variable 'bond_port_profile' from source: include params 11728 1726882198.16642: variable 'bond_port_profile' from source: include params 11728 1726882198.16697: variable 'omit' from source: magic vars 11728 1726882198.16722: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882198.16762: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882198.16786: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882198.16914: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882198.16917: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882198.16920: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882198.16923: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882198.16925: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882198.16983: Set connection var ansible_connection to ssh 11728 1726882198.17004: Set connection var ansible_shell_executable to /bin/sh 11728 1726882198.17022: Set connection var ansible_timeout to 10 11728 1726882198.17029: Set connection var ansible_shell_type to sh 11728 1726882198.17041: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882198.17049: Set connection var ansible_pipelining to False 11728 1726882198.17074: variable 'ansible_shell_executable' from source: unknown 11728 1726882198.17081: variable 'ansible_connection' from source: unknown 11728 1726882198.17087: variable 'ansible_module_compression' from source: unknown 11728 1726882198.17096: variable 'ansible_shell_type' from source: unknown 11728 1726882198.17104: variable 'ansible_shell_executable' from source: unknown 11728 1726882198.17111: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882198.17118: variable 'ansible_pipelining' from source: unknown 11728 1726882198.17134: variable 'ansible_timeout' from source: unknown 11728 1726882198.17141: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882198.17296: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py 
(searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882198.17315: variable 'omit' from source: magic vars 11728 1726882198.17349: starting attempt loop 11728 1726882198.17352: running the handler 11728 1726882198.17454: variable 'lsr_net_profile_fingerprint' from source: set_fact 11728 1726882198.17566: Evaluated conditional (lsr_net_profile_fingerprint): True 11728 1726882198.17569: handler run complete 11728 1726882198.17571: attempt loop complete, returning result 11728 1726882198.17572: _execute() done 11728 1726882198.17575: dumping result to json 11728 1726882198.17577: done dumping result, returning 11728 1726882198.17579: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.1 [12673a56-9f93-5c28-a762-0000000004ea] 11728 1726882198.17581: sending task result for task 12673a56-9f93-5c28-a762-0000000004ea 11728 1726882198.17649: done sending task result for task 12673a56-9f93-5c28-a762-0000000004ea 11728 1726882198.17652: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11728 1726882198.17709: no more pending results, returning what we have 11728 1726882198.17713: results queue empty 11728 1726882198.17714: checking for any_errors_fatal 11728 1726882198.17721: done checking for any_errors_fatal 11728 1726882198.17722: checking for max_fail_percentage 11728 1726882198.17724: done checking for max_fail_percentage 11728 1726882198.17725: checking to see if all hosts have failed and the running result is not ok 11728 1726882198.17726: done checking to see if all hosts have failed 11728 1726882198.17727: getting the remaining hosts for this loop 11728 1726882198.17729: done getting the remaining hosts for this loop 11728 1726882198.17732: getting the next task for host managed_node3 11728 1726882198.17745: done getting next task for host managed_node3 11728 1726882198.17749: ^ task is: TASK: ** TEST check bond settings 11728 1726882198.17753: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882198.17758: getting variables 11728 1726882198.17760: in VariableManager get_vars() 11728 1726882198.17798: Calling all_inventory to load vars for managed_node3 11728 1726882198.17801: Calling groups_inventory to load vars for managed_node3 11728 1726882198.17805: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882198.17816: Calling all_plugins_play to load vars for managed_node3 11728 1726882198.17820: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882198.17823: Calling groups_plugins_play to load vars for managed_node3 11728 1726882198.19474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882198.22367: done with get_vars() 11728 1726882198.22403: done getting variables 11728 1726882198.22582: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check bond settings] ********************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:3 Friday 20 September 2024 21:29:58 -0400 (0:00:00.076) 0:00:23.078 ****** 11728 1726882198.22628: entering _queue_task() for managed_node3/command 11728 1726882198.23316: worker is 1 (out of 1 available) 11728 1726882198.23328: exiting _queue_task() for managed_node3/command 11728 1726882198.23596: done queuing things up, now waiting for results queue to drain 11728 1726882198.23598: waiting for pending results... 
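The "** TEST check bond settings" task queued above is a command task looped over bond_options_to_assert; the trace below shows bond_opt as the per-item variable and controller_device coming from play vars before the module is shipped to the target over SSH. The command line itself is not visible in this part of the log; one plausible shape for such a check, with the sysfs path and the structure of bond_options_to_assert being assumptions, is:

# Hypothetical sketch: bond_options_to_assert is assumed to be a list of
# {key, value} mappings; the sysfs read is an illustration, not the actual
# command used by the test.
- name: "** TEST check bond settings"
  ansible.builtin.command:
    cmd: "cat /sys/class/net/{{ controller_device }}/bonding/{{ bond_opt.key }}"
  loop: "{{ bond_options_to_assert }}"
  loop_control:
    loop_var: bond_opt
  register: bond_opt_result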
11728 1726882198.23815: running TaskExecutor() for managed_node3/TASK: ** TEST check bond settings 11728 1726882198.23981: in run() - task 12673a56-9f93-5c28-a762-000000000400 11728 1726882198.24000: variable 'ansible_search_path' from source: unknown 11728 1726882198.24003: variable 'ansible_search_path' from source: unknown 11728 1726882198.24111: variable 'bond_options_to_assert' from source: play vars 11728 1726882198.24477: variable 'bond_options_to_assert' from source: play vars 11728 1726882198.24936: variable 'omit' from source: magic vars 11728 1726882198.25200: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882198.25205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882198.25345: variable 'omit' from source: magic vars 11728 1726882198.25779: variable 'ansible_distribution_major_version' from source: facts 11728 1726882198.25783: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882198.26187: variable 'omit' from source: magic vars 11728 1726882198.26235: variable 'omit' from source: magic vars 11728 1726882198.26911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882198.31943: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882198.31948: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882198.32021: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882198.32201: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882198.32205: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882198.32266: variable 'controller_device' from source: play vars 11728 1726882198.32269: variable 'bond_opt' from source: unknown 11728 1726882198.32297: variable 'omit' from source: magic vars 11728 1726882198.32512: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882198.32707: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882198.32711: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882198.32713: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882198.32719: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882198.32750: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882198.32870: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882198.32874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882198.33083: Set connection var ansible_connection to ssh 11728 1726882198.33098: Set connection var ansible_shell_executable to /bin/sh 11728 1726882198.33102: Set connection var ansible_timeout to 10 11728 1726882198.33104: Set connection var ansible_shell_type to sh 11728 1726882198.33113: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882198.33125: Set connection var ansible_pipelining to False 11728 
1726882198.33142: variable 'ansible_shell_executable' from source: unknown 11728 1726882198.33144: variable 'ansible_connection' from source: unknown 11728 1726882198.33147: variable 'ansible_module_compression' from source: unknown 11728 1726882198.33149: variable 'ansible_shell_type' from source: unknown 11728 1726882198.33151: variable 'ansible_shell_executable' from source: unknown 11728 1726882198.33155: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882198.33159: variable 'ansible_pipelining' from source: unknown 11728 1726882198.33162: variable 'ansible_timeout' from source: unknown 11728 1726882198.33167: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882198.33509: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882198.33636: variable 'omit' from source: magic vars 11728 1726882198.33640: starting attempt loop 11728 1726882198.33643: running the handler 11728 1726882198.33659: _low_level_execute_command(): starting 11728 1726882198.33665: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882198.34844: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882198.34912: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882198.34922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882198.34936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882198.34949: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882198.34960: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882198.34969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882198.34983: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882198.34990: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882198.35001: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11728 1726882198.35007: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882198.35215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882198.35219: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882198.35929: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882198.37261: stdout chunk (state=3): >>>/root <<< 11728 1726882198.37302: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882198.37308: stdout chunk (state=3): >>><<< 11728 1726882198.37319: stderr chunk (state=3): >>><<< 11728 
1726882198.37342: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882198.37354: _low_level_execute_command(): starting 11728 1726882198.37360: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882198.373427-12864-269792713290501 `" && echo ansible-tmp-1726882198.373427-12864-269792713290501="` echo /root/.ansible/tmp/ansible-tmp-1726882198.373427-12864-269792713290501 `" ) && sleep 0' 11728 1726882198.39200: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882198.39325: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882198.39406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882198.41377: stdout chunk (state=3): >>>ansible-tmp-1726882198.373427-12864-269792713290501=/root/.ansible/tmp/ansible-tmp-1726882198.373427-12864-269792713290501 <<< 11728 1726882198.41455: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882198.41458: stdout chunk (state=3): >>><<< 11728 1726882198.41465: stderr chunk (state=3): >>><<< 11728 1726882198.41482: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882198.373427-12864-269792713290501=/root/.ansible/tmp/ansible-tmp-1726882198.373427-12864-269792713290501 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882198.41514: variable 'ansible_module_compression' from source: unknown 11728 1726882198.41602: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882198.41605: variable 'ansible_facts' from source: unknown 11728 1726882198.41976: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882198.373427-12864-269792713290501/AnsiballZ_command.py 11728 1726882198.42611: Sending initial data 11728 1726882198.42613: Sent initial data (155 bytes) 11728 1726882198.43428: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882198.43434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882198.43485: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882198.43489: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 11728 1726882198.43507: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882198.43510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882198.43630: stderr chunk (state=3): >>>debug2: match found <<< 11728 1726882198.43634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882198.43768: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882198.43771: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882198.43874: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 
1726882198.43916: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882198.45523: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882198.45643: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882198.45667: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpvv6o36rq /root/.ansible/tmp/ansible-tmp-1726882198.373427-12864-269792713290501/AnsiballZ_command.py <<< 11728 1726882198.45672: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882198.373427-12864-269792713290501/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpvv6o36rq" to remote "/root/.ansible/tmp/ansible-tmp-1726882198.373427-12864-269792713290501/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882198.373427-12864-269792713290501/AnsiballZ_command.py" <<< 11728 1726882198.47714: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882198.47718: stdout chunk (state=3): >>><<< 11728 1726882198.47720: stderr chunk (state=3): >>><<< 11728 1726882198.47800: done transferring module to remote 11728 1726882198.47803: _low_level_execute_command(): starting 11728 1726882198.47806: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882198.373427-12864-269792713290501/ /root/.ansible/tmp/ansible-tmp-1726882198.373427-12864-269792713290501/AnsiballZ_command.py && sleep 0' 11728 1726882198.49500: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882198.49504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882198.49507: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882198.49509: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882198.49511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 11728 1726882198.49513: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882198.49615: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882198.49635: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882198.49822: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882198.50333: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882198.52066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882198.52077: stdout chunk (state=3): >>><<< 11728 1726882198.52089: stderr chunk (state=3): >>><<< 11728 1726882198.52137: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882198.52229: _low_level_execute_command(): starting 11728 1726882198.52239: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882198.373427-12864-269792713290501/AnsiballZ_command.py && sleep 0' 11728 1726882198.53611: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882198.53649: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882198.53683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882198.53907: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882198.53985: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882198.54083: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882198.54326: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882198.69466: stdout chunk (state=3): >>> {"changed": true, "stdout": "802.3ad 4", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/mode"], "start": "2024-09-20 21:29:58.689769", "end": "2024-09-20 21:29:58.692861", "delta": "0:00:00.003092", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/mode", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882198.71079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882198.71084: stdout chunk (state=3): >>><<< 11728 1726882198.71087: stderr chunk (state=3): >>><<< 11728 1726882198.71090: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "802.3ad 4", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/mode"], "start": "2024-09-20 21:29:58.689769", "end": "2024-09-20 21:29:58.692861", "delta": "0:00:00.003092", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/mode", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
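The entries above show one complete iteration of the verification loop: the ansible.legacy.command module runs cat /sys/class/net/nm-bond/bonding/mode on the target, the JSON result comes back with stdout "802.3ad 4", and the task then evaluates the per-item conditional (bond_opt.value in result.stdout) to True before reporting the item ok. A minimal sketch of a task that produces this exact pattern is given below; bond_opt, controller_device and result are the names visible in the trace, while the task name, the bond_options_to_assert variable, and the retries/delay values are assumptions added for illustration. The real test playbook may structure the loop differently.

- name: Verify each bond option via sysfs (illustrative sketch)
  ansible.builtin.command:
    cmd: "cat /sys/class/net/{{ controller_device }}/bonding/{{ bond_opt.key }}"
  register: result
  # The trace logs "Evaluated conditional (bond_opt.value in result.stdout): True"
  # and reports "attempts": 1, consistent with an until-style retry loop.
  until: bond_opt.value in result.stdout
  retries: 3          # retry count is an assumption; the log only shows attempts: 1
  delay: 1
  loop: "{{ bond_options_to_assert | dict2items }}"   # hypothetical variable name
  loop_control:
    loop_var: bond_opt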
11728 1726882198.71097: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/mode', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882198.373427-12864-269792713290501/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882198.71105: _low_level_execute_command(): starting 11728 1726882198.71107: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882198.373427-12864-269792713290501/ > /dev/null 2>&1 && sleep 0' 11728 1726882198.72365: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882198.72369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882198.72487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882198.72490: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882198.72498: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882198.72501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882198.72546: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882198.72683: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882198.72782: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882198.74751: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882198.74754: stdout chunk (state=3): >>><<< 11728 1726882198.74756: stderr chunk (state=3): >>><<< 11728 1726882198.74758: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882198.74760: handler run complete 11728 1726882198.74762: Evaluated conditional (False): False 11728 1726882198.75188: variable 'bond_opt' from source: unknown 11728 1726882198.75191: variable 'result' from source: unknown 11728 1726882198.75197: Evaluated conditional (bond_opt.value in result.stdout): True 11728 1726882198.75200: attempt loop complete, returning result 11728 1726882198.75224: variable 'bond_opt' from source: unknown 11728 1726882198.75409: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'mode', 'value': '802.3ad'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "mode", "value": "802.3ad" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/mode" ], "delta": "0:00:00.003092", "end": "2024-09-20 21:29:58.692861", "rc": 0, "start": "2024-09-20 21:29:58.689769" } STDOUT: 802.3ad 4 11728 1726882198.75907: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882198.75911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882198.75913: variable 'omit' from source: magic vars 11728 1726882198.76200: variable 'ansible_distribution_major_version' from source: facts 11728 1726882198.76203: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882198.76206: variable 'omit' from source: magic vars 11728 1726882198.76235: variable 'omit' from source: magic vars 11728 1726882198.76640: variable 'controller_device' from source: play vars 11728 1726882198.76651: variable 'bond_opt' from source: unknown 11728 1726882198.76686: variable 'omit' from source: magic vars 11728 1726882198.76889: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882198.76896: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882198.76899: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882198.76902: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882198.76904: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882198.76906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882198.77046: Set connection var ansible_connection to ssh 11728 1726882198.77061: Set connection var ansible_shell_executable to /bin/sh 11728 1726882198.77117: Set connection var ansible_timeout to 10 11728 1726882198.77125: Set connection var ansible_shell_type to sh 11728 1726882198.77138: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882198.77147: Set connection var ansible_pipelining to False 11728 1726882198.77399: variable 'ansible_shell_executable' from 
source: unknown 11728 1726882198.77402: variable 'ansible_connection' from source: unknown 11728 1726882198.77405: variable 'ansible_module_compression' from source: unknown 11728 1726882198.77407: variable 'ansible_shell_type' from source: unknown 11728 1726882198.77409: variable 'ansible_shell_executable' from source: unknown 11728 1726882198.77411: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882198.77413: variable 'ansible_pipelining' from source: unknown 11728 1726882198.77415: variable 'ansible_timeout' from source: unknown 11728 1726882198.77417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882198.77485: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882198.77554: variable 'omit' from source: magic vars 11728 1726882198.77741: starting attempt loop 11728 1726882198.77744: running the handler 11728 1726882198.77748: _low_level_execute_command(): starting 11728 1726882198.77751: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882198.79496: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882198.79501: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882198.79505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882198.79508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882198.79517: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882198.79520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882198.79523: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882198.79805: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882198.79824: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882198.80027: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882198.81648: stdout chunk (state=3): >>>/root <<< 11728 1726882198.81778: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882198.81900: stderr chunk (state=3): >>><<< 11728 1726882198.81917: stdout chunk (state=3): >>><<< 11728 1726882198.81932: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882198.81946: _low_level_execute_command(): starting 11728 1726882198.82062: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882198.819377-12864-68083470796188 `" && echo ansible-tmp-1726882198.819377-12864-68083470796188="` echo /root/.ansible/tmp/ansible-tmp-1726882198.819377-12864-68083470796188 `" ) && sleep 0' 11728 1726882198.83198: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882198.83406: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882198.83588: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882198.85521: stdout chunk (state=3): >>>ansible-tmp-1726882198.819377-12864-68083470796188=/root/.ansible/tmp/ansible-tmp-1726882198.819377-12864-68083470796188 <<< 11728 1726882198.85626: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882198.85636: stdout chunk (state=3): >>><<< 11728 1726882198.85647: stderr chunk (state=3): >>><<< 11728 1726882198.85671: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882198.819377-12864-68083470796188=/root/.ansible/tmp/ansible-tmp-1726882198.819377-12864-68083470796188 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882198.85902: variable 'ansible_module_compression' from source: unknown 11728 1726882198.85906: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882198.85909: variable 'ansible_facts' from source: unknown 11728 1726882198.86090: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882198.819377-12864-68083470796188/AnsiballZ_command.py 11728 1726882198.86485: Sending initial data 11728 1726882198.86488: Sent initial data (154 bytes) 11728 1726882198.87811: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882198.87815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882198.87818: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882198.87821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882198.88170: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882198.88260: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882198.88333: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882198.89917: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: 
Server supports extension "expand-path@openssh.com" revision 1 <<< 11728 1726882198.89923: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882198.90001: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882198.90022: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmprao0r3ae /root/.ansible/tmp/ansible-tmp-1726882198.819377-12864-68083470796188/AnsiballZ_command.py <<< 11728 1726882198.90028: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882198.819377-12864-68083470796188/AnsiballZ_command.py" <<< 11728 1726882198.90211: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmprao0r3ae" to remote "/root/.ansible/tmp/ansible-tmp-1726882198.819377-12864-68083470796188/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882198.819377-12864-68083470796188/AnsiballZ_command.py" <<< 11728 1726882198.92654: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882198.92658: stdout chunk (state=3): >>><<< 11728 1726882198.92660: stderr chunk (state=3): >>><<< 11728 1726882198.92663: done transferring module to remote 11728 1726882198.92665: _low_level_execute_command(): starting 11728 1726882198.92667: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882198.819377-12864-68083470796188/ /root/.ansible/tmp/ansible-tmp-1726882198.819377-12864-68083470796188/AnsiballZ_command.py && sleep 0' 11728 1726882198.94401: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882198.94625: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882198.94643: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882198.94722: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882198.96585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882198.96588: stdout chunk (state=3): >>><<< 11728 1726882198.96591: stderr chunk (state=3): >>><<< 11728 1726882198.96607: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882198.96613: _low_level_execute_command(): starting 11728 1726882198.96816: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882198.819377-12864-68083470796188/AnsiballZ_command.py && sleep 0' 11728 1726882198.97958: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882198.97962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882198.98005: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882198.98009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882198.98011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882198.98073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 11728 1726882198.98083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882198.98309: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882199.13703: stdout chunk (state=3): >>> {"changed": true, "stdout": "65535", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_actor_sys_prio"], "start": "2024-09-20 21:29:59.132049", "end": "2024-09-20 21:29:59.135289", "delta": "0:00:00.003240", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_actor_sys_prio", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882199.15324: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882199.15421: stderr chunk (state=3): >>><<< 11728 1726882199.15425: stdout chunk (state=3): >>><<< 11728 1726882199.15444: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "65535", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_actor_sys_prio"], "start": "2024-09-20 21:29:59.132049", "end": "2024-09-20 21:29:59.135289", "delta": "0:00:00.003240", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_actor_sys_prio", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
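Two loop items have now been checked and reported ok: mode with expected value 802.3ad (the kernel reports "802.3ad 4") and ad_actor_sys_prio with expected value 65535. The item shape ({'key': ..., 'value': ...}) suggests the loop input is a mapping passed through dict2items. A sketch of such a mapping, built only from the values observed in this run (including the ad_actor_system item verified in the entries that follow), might look like this; the variable name bond_options_to_assert is hypothetical.

bond_options_to_assert:
  mode: "802.3ad"
  ad_actor_sys_prio: "65535"
  ad_actor_system: "00:00:5e:00:53:5d"

Each key corresponds to a file under /sys/class/net/nm-bond/bonding/, and each value is the substring expected in that file's contents.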
11728 1726882199.15474: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/ad_actor_sys_prio', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882198.819377-12864-68083470796188/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882199.15481: _low_level_execute_command(): starting 11728 1726882199.15486: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882198.819377-12864-68083470796188/ > /dev/null 2>&1 && sleep 0' 11728 1726882199.16959: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882199.16962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882199.17449: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882199.17467: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882199.17731: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882199.19523: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882199.19612: stderr chunk (state=3): >>><<< 11728 1726882199.19615: stdout chunk (state=3): >>><<< 11728 1726882199.19634: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882199.19639: handler run complete 11728 1726882199.19682: Evaluated conditional (False): False 11728 1726882199.20035: variable 'bond_opt' from source: unknown 11728 1726882199.20043: variable 'result' from source: unknown 11728 1726882199.20099: Evaluated conditional (bond_opt.value in result.stdout): True 11728 1726882199.20103: attempt loop complete, returning result 11728 1726882199.20105: variable 'bond_opt' from source: unknown 11728 1726882199.20153: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'ad_actor_sys_prio', 'value': '65535'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "ad_actor_sys_prio", "value": "65535" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/ad_actor_sys_prio" ], "delta": "0:00:00.003240", "end": "2024-09-20 21:29:59.135289", "rc": 0, "start": "2024-09-20 21:29:59.132049" } STDOUT: 65535 11728 1726882199.20467: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882199.20475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882199.20485: variable 'omit' from source: magic vars 11728 1726882199.20852: variable 'ansible_distribution_major_version' from source: facts 11728 1726882199.20856: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882199.20858: variable 'omit' from source: magic vars 11728 1726882199.20908: variable 'omit' from source: magic vars 11728 1726882199.21199: variable 'controller_device' from source: play vars 11728 1726882199.21291: variable 'bond_opt' from source: unknown 11728 1726882199.21315: variable 'omit' from source: magic vars 11728 1726882199.21335: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882199.21420: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882199.21423: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882199.21426: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882199.21431: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882199.21434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882199.21553: Set connection var ansible_connection to ssh 11728 1726882199.21562: Set connection var ansible_shell_executable to /bin/sh 11728 1726882199.21567: Set connection var ansible_timeout to 10 11728 1726882199.21570: Set connection var ansible_shell_type to sh 11728 1726882199.21577: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882199.21582: Set connection var ansible_pipelining to False 11728 1726882199.21728: variable 'ansible_shell_executable' from source: unknown 11728 1726882199.21732: variable 'ansible_connection' from source: unknown 11728 1726882199.21734: variable 'ansible_module_compression' from source: unknown 11728 
1726882199.21745: variable 'ansible_shell_type' from source: unknown 11728 1726882199.21748: variable 'ansible_shell_executable' from source: unknown 11728 1726882199.21750: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882199.21854: variable 'ansible_pipelining' from source: unknown 11728 1726882199.21857: variable 'ansible_timeout' from source: unknown 11728 1726882199.21860: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882199.21942: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882199.21950: variable 'omit' from source: magic vars 11728 1726882199.21953: starting attempt loop 11728 1726882199.21961: running the handler 11728 1726882199.21963: _low_level_execute_command(): starting 11728 1726882199.21966: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882199.23590: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882199.23675: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882199.23697: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882199.23766: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882199.25402: stdout chunk (state=3): >>>/root <<< 11728 1726882199.25661: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882199.25664: stdout chunk (state=3): >>><<< 11728 1726882199.25671: stderr chunk (state=3): >>><<< 11728 1726882199.25689: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882199.25702: _low_level_execute_command(): starting 11728 1726882199.25708: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882199.2568893-12864-217796615759955 `" && echo ansible-tmp-1726882199.2568893-12864-217796615759955="` echo /root/.ansible/tmp/ansible-tmp-1726882199.2568893-12864-217796615759955 `" ) && sleep 0' 11728 1726882199.26970: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882199.26978: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882199.27007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882199.27018: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882199.27028: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882199.27034: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11728 1726882199.27041: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882199.27050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882199.27061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882199.27068: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882199.27486: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882199.27600: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882199.27609: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882199.29444: stdout chunk (state=3): >>>ansible-tmp-1726882199.2568893-12864-217796615759955=/root/.ansible/tmp/ansible-tmp-1726882199.2568893-12864-217796615759955 <<< 11728 1726882199.29595: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882199.29601: stdout chunk (state=3): >>><<< 11728 1726882199.29609: stderr chunk (state=3): >>><<< 11728 1726882199.29626: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882199.2568893-12864-217796615759955=/root/.ansible/tmp/ansible-tmp-1726882199.2568893-12864-217796615759955 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882199.29652: variable 'ansible_module_compression' from source: unknown 11728 1726882199.29695: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882199.29710: variable 'ansible_facts' from source: unknown 11728 1726882199.29906: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882199.2568893-12864-217796615759955/AnsiballZ_command.py 11728 1726882199.30246: Sending initial data 11728 1726882199.30290: Sent initial data (156 bytes) 11728 1726882199.31510: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882199.31652: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882199.31998: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882199.32001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882199.33578: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension 
"copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882199.33623: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882199.33672: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpzqdbea1m /root/.ansible/tmp/ansible-tmp-1726882199.2568893-12864-217796615759955/AnsiballZ_command.py <<< 11728 1726882199.33681: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882199.2568893-12864-217796615759955/AnsiballZ_command.py" <<< 11728 1726882199.33720: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpzqdbea1m" to remote "/root/.ansible/tmp/ansible-tmp-1726882199.2568893-12864-217796615759955/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882199.2568893-12864-217796615759955/AnsiballZ_command.py" <<< 11728 1726882199.35124: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882199.35177: stderr chunk (state=3): >>><<< 11728 1726882199.35187: stdout chunk (state=3): >>><<< 11728 1726882199.35255: done transferring module to remote 11728 1726882199.35308: _low_level_execute_command(): starting 11728 1726882199.35319: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882199.2568893-12864-217796615759955/ /root/.ansible/tmp/ansible-tmp-1726882199.2568893-12864-217796615759955/AnsiballZ_command.py && sleep 0' 11728 1726882199.36610: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882199.36635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882199.36709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882199.36874: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882199.36891: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882199.37174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882199.38935: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882199.38946: stdout chunk (state=3): >>><<< 11728 1726882199.38958: stderr chunk (state=3): >>><<< 11728 1726882199.39011: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882199.39020: _low_level_execute_command(): starting 11728 1726882199.39028: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882199.2568893-12864-217796615759955/AnsiballZ_command.py && sleep 0' 11728 1726882199.40255: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882199.40259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882199.40280: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882199.40470: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882199.40528: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882199.40545: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882199.40573: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882199.40658: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882200.56220: stdout chunk (state=3): >>> {"changed": true, "stdout": "00:00:5e:00:53:5d", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_actor_system"], "start": "2024-09-20 21:29:59.554530", "end": "2024-09-20 21:30:00.558743", "delta": "0:00:01.004213", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_actor_system", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882200.57819: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882200.57822: stdout chunk (state=3): >>><<< 11728 1726882200.57824: stderr chunk (state=3): >>><<< 11728 1726882200.57841: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "00:00:5e:00:53:5d", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_actor_system"], "start": "2024-09-20 21:29:59.554530", "end": "2024-09-20 21:30:00.558743", "delta": "0:00:01.004213", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_actor_system", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
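The module result above (cat /sys/class/net/nm-bond/bonding/ad_actor_system returning 00:00:5e:00:53:5d) is one pass of a task loop: the callback records that follow report ansible_loop_var bond_opt, evaluate the conditional bond_opt.value in result.stdout, and report the item as unchanged. A task shaped roughly like the sketch below would produce that behaviour. This is a reconstruction from the log, not the actual task from the test role; controller_device and the bond_opt loop variable do appear in the log, while the bond_options_to_assert variable name is an assumption.

    # Sketch only, reconstructed from the verbose output above.
    # Assumes bond_options_to_assert is a dict of option name -> expected value
    # (the concrete pairs seen in this excerpt are collected further below).
    - name: Verify each bonding option exposed under /sys
      ansible.builtin.command:
        cmd: cat /sys/class/net/{{ controller_device }}/bonding/{{ bond_opt.key }}
      register: result
      changed_when: false                      # reading sysfs never changes state; matches "changed": false in the ok: result
      until: bond_opt.value in result.stdout   # the conditional the log shows being evaluated per item
      retries: 3
      delay: 1
      loop: "{{ bond_options_to_assert | dict2items }}"
      loop_control:
        loop_var: bond_opt

With until succeeding on the first try, the callback reports "attempts": 1 for each item, as seen in the ok: blocks in this excerpt.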
11728 1726882200.57872: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/ad_actor_system', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882199.2568893-12864-217796615759955/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882200.57883: _low_level_execute_command(): starting 11728 1726882200.57891: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882199.2568893-12864-217796615759955/ > /dev/null 2>&1 && sleep 0' 11728 1726882200.59190: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882200.59279: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882200.59408: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882200.59428: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882200.59449: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882200.59525: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882200.61454: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882200.61478: stderr chunk (state=3): >>><<< 11728 1726882200.61488: stdout chunk (state=3): >>><<< 11728 1726882200.61510: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882200.61513: handler run complete 11728 1726882200.61535: Evaluated conditional (False): False 11728 1726882200.61921: variable 'bond_opt' from source: unknown 11728 1726882200.61928: variable 'result' from source: unknown 11728 1726882200.61941: Evaluated conditional (bond_opt.value in result.stdout): True 11728 1726882200.62000: attempt loop complete, returning result 11728 1726882200.62003: variable 'bond_opt' from source: unknown 11728 1726882200.62154: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'ad_actor_system', 'value': '00:00:5e:00:53:5d'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "ad_actor_system", "value": "00:00:5e:00:53:5d" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/ad_actor_system" ], "delta": "0:00:01.004213", "end": "2024-09-20 21:30:00.558743", "rc": 0, "start": "2024-09-20 21:29:59.554530" } STDOUT: 00:00:5e:00:53:5d 11728 1726882200.62476: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882200.62481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882200.62483: variable 'omit' from source: magic vars 11728 1726882200.62833: variable 'ansible_distribution_major_version' from source: facts 11728 1726882200.62838: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882200.62842: variable 'omit' from source: magic vars 11728 1726882200.62975: variable 'omit' from source: magic vars 11728 1726882200.63255: variable 'controller_device' from source: play vars 11728 1726882200.63258: variable 'bond_opt' from source: unknown 11728 1726882200.63274: variable 'omit' from source: magic vars 11728 1726882200.63423: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882200.63435: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882200.63472: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882200.63475: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882200.63477: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882200.63479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882200.63689: Set connection var ansible_connection to ssh 11728 1726882200.63696: Set connection var ansible_shell_executable to /bin/sh 11728 1726882200.63699: Set connection var ansible_timeout to 10 11728 1726882200.63720: Set connection var ansible_shell_type to sh 11728 1726882200.63730: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882200.63737: Set connection var ansible_pipelining to False 11728 1726882200.63803: variable 'ansible_shell_executable' from source: unknown 11728 1726882200.63826: variable 
'ansible_connection' from source: unknown 11728 1726882200.63834: variable 'ansible_module_compression' from source: unknown 11728 1726882200.63918: variable 'ansible_shell_type' from source: unknown 11728 1726882200.63922: variable 'ansible_shell_executable' from source: unknown 11728 1726882200.63924: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882200.63926: variable 'ansible_pipelining' from source: unknown 11728 1726882200.63928: variable 'ansible_timeout' from source: unknown 11728 1726882200.63930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882200.63983: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882200.64001: variable 'omit' from source: magic vars 11728 1726882200.64010: starting attempt loop 11728 1726882200.64017: running the handler 11728 1726882200.64032: _low_level_execute_command(): starting 11728 1726882200.64041: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882200.64690: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882200.64736: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882200.64750: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882200.64954: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882200.65010: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882200.66691: stdout chunk (state=3): >>>/root <<< 11728 1726882200.66811: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882200.66865: stderr chunk (state=3): >>><<< 11728 1726882200.66966: stdout chunk (state=3): >>><<< 11728 1726882200.66976: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882200.66979: _low_level_execute_command(): starting 11728 1726882200.66982: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882200.6688461-12864-17746171855557 `" && echo ansible-tmp-1726882200.6688461-12864-17746171855557="` echo /root/.ansible/tmp/ansible-tmp-1726882200.6688461-12864-17746171855557 `" ) && sleep 0' 11728 1726882200.67527: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882200.67546: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882200.67562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882200.67580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882200.67652: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882200.67656: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882200.67707: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882200.67733: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882200.67754: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882200.67822: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882200.70044: stdout chunk (state=3): >>>ansible-tmp-1726882200.6688461-12864-17746171855557=/root/.ansible/tmp/ansible-tmp-1726882200.6688461-12864-17746171855557 <<< 11728 1726882200.70253: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882200.70257: stdout chunk (state=3): >>><<< 11728 1726882200.70259: stderr chunk (state=3): >>><<< 11728 1726882200.70262: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882200.6688461-12864-17746171855557=/root/.ansible/tmp/ansible-tmp-1726882200.6688461-12864-17746171855557 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882200.70264: variable 'ansible_module_compression' from source: unknown 11728 1726882200.70267: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882200.70269: variable 'ansible_facts' from source: unknown 11728 1726882200.70401: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882200.6688461-12864-17746171855557/AnsiballZ_command.py 11728 1726882200.70623: Sending initial data 11728 1726882200.70637: Sent initial data (155 bytes) 11728 1726882200.71155: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882200.71171: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882200.71186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882200.71300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882200.71317: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882200.71333: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882200.71411: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882200.72989: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server 
supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882200.73026: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882200.73072: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpt99ildxm /root/.ansible/tmp/ansible-tmp-1726882200.6688461-12864-17746171855557/AnsiballZ_command.py <<< 11728 1726882200.73075: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882200.6688461-12864-17746171855557/AnsiballZ_command.py" <<< 11728 1726882200.73132: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpt99ildxm" to remote "/root/.ansible/tmp/ansible-tmp-1726882200.6688461-12864-17746171855557/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882200.6688461-12864-17746171855557/AnsiballZ_command.py" <<< 11728 1726882200.75000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882200.75004: stdout chunk (state=3): >>><<< 11728 1726882200.75007: stderr chunk (state=3): >>><<< 11728 1726882200.75009: done transferring module to remote 11728 1726882200.75011: _low_level_execute_command(): starting 11728 1726882200.75013: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882200.6688461-12864-17746171855557/ /root/.ansible/tmp/ansible-tmp-1726882200.6688461-12864-17746171855557/AnsiballZ_command.py && sleep 0' 11728 1726882200.75624: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882200.75641: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882200.75651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882200.75665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882200.75677: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882200.75685: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882200.75695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882200.75713: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882200.75746: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882200.75806: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882200.75832: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882200.75910: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882200.77813: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882200.77820: stdout chunk (state=3): >>><<< 11728 1726882200.77986: stderr chunk (state=3): >>><<< 11728 1726882200.77990: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882200.77995: _low_level_execute_command(): starting 11728 1726882200.77998: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882200.6688461-12864-17746171855557/AnsiballZ_command.py && sleep 0' 11728 1726882200.78600: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882200.78604: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882200.78684: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882200.94129: stdout chunk (state=3): >>> {"changed": true, "stdout": "stable 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_select"], "start": "2024-09-20 21:30:00.935013", "end": "2024-09-20 21:30:00.938131", "delta": "0:00:00.003118", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_select", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, 
"strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882200.95702: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882200.95706: stdout chunk (state=3): >>><<< 11728 1726882200.95709: stderr chunk (state=3): >>><<< 11728 1726882200.95711: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "stable 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_select"], "start": "2024-09-20 21:30:00.935013", "end": "2024-09-20 21:30:00.938131", "delta": "0:00:00.003118", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_select", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
11728 1726882200.95713: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/ad_select', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882200.6688461-12864-17746171855557/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882200.95716: _low_level_execute_command(): starting 11728 1726882200.95719: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882200.6688461-12864-17746171855557/ > /dev/null 2>&1 && sleep 0' 11728 1726882200.96313: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882200.96321: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882200.96332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882200.96347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882200.96371: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882200.96378: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882200.96388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882200.96407: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882200.96473: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882200.96508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882200.96523: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882200.96544: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882200.96623: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882200.98525: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882200.98529: stdout chunk (state=3): >>><<< 11728 1726882200.98531: stderr chunk (state=3): >>><<< 11728 1726882200.98557: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882200.98701: handler run complete 11728 1726882200.98705: Evaluated conditional (False): False 11728 1726882200.98746: variable 'bond_opt' from source: unknown 11728 1726882200.98758: variable 'result' from source: unknown 11728 1726882200.98776: Evaluated conditional (bond_opt.value in result.stdout): True 11728 1726882200.98792: attempt loop complete, returning result 11728 1726882200.98830: variable 'bond_opt' from source: unknown 11728 1726882200.98904: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'ad_select', 'value': 'stable'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "ad_select", "value": "stable" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/ad_select" ], "delta": "0:00:00.003118", "end": "2024-09-20 21:30:00.938131", "rc": 0, "start": "2024-09-20 21:30:00.935013" } STDOUT: stable 0 11728 1726882200.99209: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882200.99213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882200.99215: variable 'omit' from source: magic vars 11728 1726882200.99353: variable 'ansible_distribution_major_version' from source: facts 11728 1726882200.99364: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882200.99372: variable 'omit' from source: magic vars 11728 1726882200.99391: variable 'omit' from source: magic vars 11728 1726882200.99575: variable 'controller_device' from source: play vars 11728 1726882200.99643: variable 'bond_opt' from source: unknown 11728 1726882200.99646: variable 'omit' from source: magic vars 11728 1726882200.99649: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882200.99654: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882200.99666: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882200.99685: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882200.99697: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882200.99706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882200.99789: Set connection var ansible_connection to ssh 11728 1726882200.99809: Set connection var ansible_shell_executable to /bin/sh 11728 1726882200.99820: Set connection var ansible_timeout to 10 11728 1726882200.99826: Set connection var 
ansible_shell_type to sh 11728 1726882200.99860: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882200.99863: Set connection var ansible_pipelining to False 11728 1726882200.99877: variable 'ansible_shell_executable' from source: unknown 11728 1726882200.99884: variable 'ansible_connection' from source: unknown 11728 1726882200.99891: variable 'ansible_module_compression' from source: unknown 11728 1726882200.99970: variable 'ansible_shell_type' from source: unknown 11728 1726882200.99973: variable 'ansible_shell_executable' from source: unknown 11728 1726882200.99975: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882200.99977: variable 'ansible_pipelining' from source: unknown 11728 1726882200.99980: variable 'ansible_timeout' from source: unknown 11728 1726882200.99982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882201.00038: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882201.00053: variable 'omit' from source: magic vars 11728 1726882201.00061: starting attempt loop 11728 1726882201.00067: running the handler 11728 1726882201.00085: _low_level_execute_command(): starting 11728 1726882201.00098: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882201.00767: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882201.00783: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882201.00802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882201.00824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882201.00849: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882201.00862: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882201.00960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882201.00976: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882201.00992: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882201.01021: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882201.01109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882201.02829: stdout chunk (state=3): >>>/root <<< 11728 1726882201.02967: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882201.03003: stdout chunk (state=3): >>><<< 11728 1726882201.03007: stderr chunk (state=3): 
>>><<< 11728 1726882201.03023: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882201.03110: _low_level_execute_command(): starting 11728 1726882201.03114: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882201.0302882-12864-32995671779695 `" && echo ansible-tmp-1726882201.0302882-12864-32995671779695="` echo /root/.ansible/tmp/ansible-tmp-1726882201.0302882-12864-32995671779695 `" ) && sleep 0' 11728 1726882201.03681: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882201.03701: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882201.03716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882201.03764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882201.03837: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882201.03861: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882201.03900: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882201.03978: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882201.05928: stdout chunk (state=3): >>>ansible-tmp-1726882201.0302882-12864-32995671779695=/root/.ansible/tmp/ansible-tmp-1726882201.0302882-12864-32995671779695 <<< 11728 1726882201.06097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 
1726882201.06101: stdout chunk (state=3): >>><<< 11728 1726882201.06103: stderr chunk (state=3): >>><<< 11728 1726882201.06122: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882201.0302882-12864-32995671779695=/root/.ansible/tmp/ansible-tmp-1726882201.0302882-12864-32995671779695 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882201.06150: variable 'ansible_module_compression' from source: unknown 11728 1726882201.06203: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882201.06277: variable 'ansible_facts' from source: unknown 11728 1726882201.06306: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882201.0302882-12864-32995671779695/AnsiballZ_command.py 11728 1726882201.06437: Sending initial data 11728 1726882201.06553: Sent initial data (155 bytes) 11728 1726882201.07331: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882201.07346: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 11728 1726882201.07357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882201.07442: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882201.07464: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882201.07480: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882201.07572: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 
1726882201.09301: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882201.09381: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882201.09438: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmplfxubpp3 /root/.ansible/tmp/ansible-tmp-1726882201.0302882-12864-32995671779695/AnsiballZ_command.py <<< 11728 1726882201.09442: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882201.0302882-12864-32995671779695/AnsiballZ_command.py" <<< 11728 1726882201.09496: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmplfxubpp3" to remote "/root/.ansible/tmp/ansible-tmp-1726882201.0302882-12864-32995671779695/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882201.0302882-12864-32995671779695/AnsiballZ_command.py" <<< 11728 1726882201.10363: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882201.10366: stdout chunk (state=3): >>><<< 11728 1726882201.10368: stderr chunk (state=3): >>><<< 11728 1726882201.10374: done transferring module to remote 11728 1726882201.10385: _low_level_execute_command(): starting 11728 1726882201.10392: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882201.0302882-12864-32995671779695/ /root/.ansible/tmp/ansible-tmp-1726882201.0302882-12864-32995671779695/AnsiballZ_command.py && sleep 0' 11728 1726882201.11005: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882201.11029: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882201.11045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882201.11134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882201.11167: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/537759ca41' <<< 11728 1726882201.11189: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882201.11251: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882201.11282: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882201.13173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882201.13177: stdout chunk (state=3): >>><<< 11728 1726882201.13179: stderr chunk (state=3): >>><<< 11728 1726882201.13281: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882201.13284: _low_level_execute_command(): starting 11728 1726882201.13287: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882201.0302882-12864-32995671779695/AnsiballZ_command.py && sleep 0' 11728 1726882201.13868: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882201.13884: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882201.13905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882201.13966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882201.14027: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882201.14044: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882201.14083: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882201.14181: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882201.29849: stdout chunk (state=3): >>> {"changed": true, "stdout": "1023", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_user_port_key"], "start": "2024-09-20 21:30:01.293644", "end": "2024-09-20 21:30:01.296876", "delta": "0:00:00.003232", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_user_port_key", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882201.31500: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882201.31505: stdout chunk (state=3): >>><<< 11728 1726882201.31507: stderr chunk (state=3): >>><<< 11728 1726882201.31510: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "1023", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/ad_user_port_key"], "start": "2024-09-20 21:30:01.293644", "end": "2024-09-20 21:30:01.296876", "delta": "0:00:00.003232", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/ad_user_port_key", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
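Taken together, the three items verified in this excerpt (ad_actor_system, ad_select, ad_user_port_key) correspond to the key/value pairs below. The variable names are illustrative; the log only confirms controller_device as a play var, but the values are exactly those reported in the ok: results and module output above.

    # Hypothetical play vars consistent with this excerpt (names assumed, values from the log).
    controller_device: nm-bond
    bond_options_to_assert:
      ad_actor_system: "00:00:5e:00:53:5d"
      ad_select: "stable"
      ad_user_port_key: "1023"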
11728 1726882201.31516: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/ad_user_port_key', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882201.0302882-12864-32995671779695/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882201.31519: _low_level_execute_command(): starting 11728 1726882201.31521: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882201.0302882-12864-32995671779695/ > /dev/null 2>&1 && sleep 0' 11728 1726882201.32128: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882201.32134: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882201.32154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882201.32166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882201.32177: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882201.32183: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882201.32192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882201.32217: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882201.32223: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882201.32229: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11728 1726882201.32237: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882201.32245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882201.32265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882201.32312: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882201.32315: stderr chunk (state=3): >>>debug2: match found <<< 11728 1726882201.32354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882201.32409: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882201.32484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882201.34600: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882201.34604: stderr chunk (state=3): >>><<< 11728 1726882201.34607: stdout chunk (state=3): >>><<< 11728 1726882201.34610: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882201.34612: handler run complete 11728 1726882201.34614: Evaluated conditional (False): False 11728 1726882201.34616: variable 'bond_opt' from source: unknown 11728 1726882201.34618: variable 'result' from source: unknown 11728 1726882201.34620: Evaluated conditional (bond_opt.value in result.stdout): True 11728 1726882201.34622: attempt loop complete, returning result 11728 1726882201.34651: variable 'bond_opt' from source: unknown 11728 1726882201.34718: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'ad_user_port_key', 'value': '1023'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "ad_user_port_key", "value": "1023" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/ad_user_port_key" ], "delta": "0:00:00.003232", "end": "2024-09-20 21:30:01.296876", "rc": 0, "start": "2024-09-20 21:30:01.293644" } STDOUT: 1023 11728 1726882201.34852: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882201.34856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882201.35081: variable 'omit' from source: magic vars 11728 1726882201.35084: variable 'ansible_distribution_major_version' from source: facts 11728 1726882201.35087: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882201.35089: variable 'omit' from source: magic vars 11728 1726882201.35105: variable 'omit' from source: magic vars 11728 1726882201.35486: variable 'controller_device' from source: play vars 11728 1726882201.35489: variable 'bond_opt' from source: unknown 11728 1726882201.35520: variable 'omit' from source: magic vars 11728 1726882201.35542: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882201.35550: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882201.35557: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882201.35572: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882201.35575: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882201.35577: variable 'ansible_ssh_extra_args' from source: 
host vars for 'managed_node3' 11728 1726882201.35775: Set connection var ansible_connection to ssh 11728 1726882201.35784: Set connection var ansible_shell_executable to /bin/sh 11728 1726882201.35789: Set connection var ansible_timeout to 10 11728 1726882201.35792: Set connection var ansible_shell_type to sh 11728 1726882201.35858: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882201.35861: Set connection var ansible_pipelining to False 11728 1726882201.35864: variable 'ansible_shell_executable' from source: unknown 11728 1726882201.35868: variable 'ansible_connection' from source: unknown 11728 1726882201.35871: variable 'ansible_module_compression' from source: unknown 11728 1726882201.35873: variable 'ansible_shell_type' from source: unknown 11728 1726882201.35875: variable 'ansible_shell_executable' from source: unknown 11728 1726882201.35967: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882201.35975: variable 'ansible_pipelining' from source: unknown 11728 1726882201.35978: variable 'ansible_timeout' from source: unknown 11728 1726882201.35980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882201.36165: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882201.36174: variable 'omit' from source: magic vars 11728 1726882201.36176: starting attempt loop 11728 1726882201.36180: running the handler 11728 1726882201.36186: _low_level_execute_command(): starting 11728 1726882201.36191: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882201.37116: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882201.37126: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882201.37142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882201.37212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882201.37255: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882201.37289: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882201.37292: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882201.37373: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882201.39552: stdout chunk (state=3): >>>/root <<< 11728 1726882201.39557: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 11728 1726882201.39559: stdout chunk (state=3): >>><<< 11728 1726882201.39561: stderr chunk (state=3): >>><<< 11728 1726882201.39563: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882201.39565: _low_level_execute_command(): starting 11728 1726882201.39567: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882201.394918-12864-92946772883103 `" && echo ansible-tmp-1726882201.394918-12864-92946772883103="` echo /root/.ansible/tmp/ansible-tmp-1726882201.394918-12864-92946772883103 `" ) && sleep 0' 11728 1726882201.40404: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882201.40408: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882201.40411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882201.40419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882201.40422: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882201.40425: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882201.40539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882201.40726: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882201.40781: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882201.42742: stdout chunk (state=3): 
>>>ansible-tmp-1726882201.394918-12864-92946772883103=/root/.ansible/tmp/ansible-tmp-1726882201.394918-12864-92946772883103 <<< 11728 1726882201.42886: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882201.42890: stdout chunk (state=3): >>><<< 11728 1726882201.42899: stderr chunk (state=3): >>><<< 11728 1726882201.42919: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882201.394918-12864-92946772883103=/root/.ansible/tmp/ansible-tmp-1726882201.394918-12864-92946772883103 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882201.42946: variable 'ansible_module_compression' from source: unknown 11728 1726882201.42980: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882201.43003: variable 'ansible_facts' from source: unknown 11728 1726882201.43065: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882201.394918-12864-92946772883103/AnsiballZ_command.py 11728 1726882201.43536: Sending initial data 11728 1726882201.43540: Sent initial data (154 bytes) 11728 1726882201.44755: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882201.44759: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882201.44762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882201.44764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882201.44766: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882201.44768: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882201.44770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 11728 1726882201.44773: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882201.44775: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882201.44834: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882201.44973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882201.46625: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882201.46668: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882201.46722: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmp4v8ywqgi /root/.ansible/tmp/ansible-tmp-1726882201.394918-12864-92946772883103/AnsiballZ_command.py <<< 11728 1726882201.46725: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882201.394918-12864-92946772883103/AnsiballZ_command.py" <<< 11728 1726882201.46768: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmp4v8ywqgi" to remote "/root/.ansible/tmp/ansible-tmp-1726882201.394918-12864-92946772883103/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882201.394918-12864-92946772883103/AnsiballZ_command.py" <<< 11728 1726882201.48577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882201.48580: stdout chunk (state=3): >>><<< 11728 1726882201.48582: stderr chunk (state=3): >>><<< 11728 1726882201.48624: done transferring module to remote 11728 1726882201.48631: _low_level_execute_command(): starting 11728 1726882201.48643: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882201.394918-12864-92946772883103/ /root/.ansible/tmp/ansible-tmp-1726882201.394918-12864-92946772883103/AnsiballZ_command.py && sleep 0' 11728 1726882201.49834: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882201.49843: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882201.49854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882201.49889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882201.49892: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882201.49897: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882201.49899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 
1726882201.50000: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882201.50004: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882201.50006: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11728 1726882201.50008: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882201.50010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882201.50012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882201.50014: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882201.50016: stderr chunk (state=3): >>>debug2: match found <<< 11728 1726882201.50018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882201.50275: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882201.50281: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882201.50353: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882201.52175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882201.52179: stdout chunk (state=3): >>><<< 11728 1726882201.52181: stderr chunk (state=3): >>><<< 11728 1726882201.52205: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882201.52208: _low_level_execute_command(): starting 11728 1726882201.52211: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882201.394918-12864-92946772883103/AnsiballZ_command.py && sleep 0' 11728 1726882201.53160: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882201.53198: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882201.53210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 
10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882201.53226: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882201.53322: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882201.53335: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882201.53349: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882201.53388: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882201.53436: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882201.69274: stdout chunk (state=3): >>> {"changed": true, "stdout": "1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/all_slaves_active"], "start": "2024-09-20 21:30:01.687723", "end": "2024-09-20 21:30:01.690983", "delta": "0:00:00.003260", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/all_slaves_active", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882201.71099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 11728 1726882201.71105: stdout chunk (state=3): >>><<< 11728 1726882201.71107: stderr chunk (state=3): >>><<< 11728 1726882201.71110: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/all_slaves_active"], "start": "2024-09-20 21:30:01.687723", "end": "2024-09-20 21:30:01.690983", "delta": "0:00:00.003260", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/all_slaves_active", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
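Each per-item execution in this log repeats the same "Set connection var ..." lines (ansible_connection ssh, shell type sh, shell executable /bin/sh, timeout 10, pipelining False, module compression ZIP_DEFLATED). The sketch below expresses those settings as host variables; the values are copied from the log, but presenting them as a host_vars file is only an illustration, since the log does not show where they are actually configured.

    # Illustrative host_vars/managed_node3.yml with the connection settings
    # reported by the "Set connection var" lines; the file location is assumed.
    ansible_connection: ssh
    ansible_shell_type: sh
    ansible_shell_executable: /bin/sh
    ansible_timeout: 10
    ansible_pipelining: false
    ansible_module_compression: ZIP_DEFLATED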
11728 1726882201.71204: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/all_slaves_active', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882201.394918-12864-92946772883103/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882201.71207: _low_level_execute_command(): starting 11728 1726882201.71210: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882201.394918-12864-92946772883103/ > /dev/null 2>&1 && sleep 0' 11728 1726882201.72405: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882201.72436: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882201.72609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882201.72767: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882201.72771: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882201.72799: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882201.72881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882201.74807: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882201.74826: stdout chunk (state=3): >>><<< 11728 1726882201.75034: stderr chunk (state=3): >>><<< 11728 1726882201.75038: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882201.75044: handler run complete 11728 1726882201.75047: Evaluated conditional (False): False 11728 1726882201.75276: variable 'bond_opt' from source: unknown 11728 1726882201.75308: variable 'result' from source: unknown 11728 1726882201.75375: Evaluated conditional (bond_opt.value in result.stdout): True 11728 1726882201.75388: attempt loop complete, returning result 11728 1726882201.75412: variable 'bond_opt' from source: unknown 11728 1726882201.75687: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'all_slaves_active', 'value': '1'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "all_slaves_active", "value": "1" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/all_slaves_active" ], "delta": "0:00:00.003260", "end": "2024-09-20 21:30:01.690983", "rc": 0, "start": "2024-09-20 21:30:01.687723" } STDOUT: 1 11728 1726882201.76100: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882201.76103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882201.76106: variable 'omit' from source: magic vars 11728 1726882201.76179: variable 'ansible_distribution_major_version' from source: facts 11728 1726882201.76233: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882201.76242: variable 'omit' from source: magic vars 11728 1726882201.76261: variable 'omit' from source: magic vars 11728 1726882201.76569: variable 'controller_device' from source: play vars 11728 1726882201.76685: variable 'bond_opt' from source: unknown 11728 1726882201.76711: variable 'omit' from source: magic vars 11728 1726882201.76736: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882201.76750: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882201.76761: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882201.76781: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882201.76807: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882201.76999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882201.77003: Set connection var ansible_connection to ssh 11728 1726882201.77006: Set connection var ansible_shell_executable to /bin/sh 11728 1726882201.77008: Set connection var ansible_timeout to 10 11728 1726882201.77010: Set connection var ansible_shell_type to sh 11728 1726882201.77012: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882201.77014: Set connection var ansible_pipelining to False 11728 1726882201.77238: variable 'ansible_shell_executable' from source: unknown 11728 1726882201.77241: variable 'ansible_connection' from source: 
unknown 11728 1726882201.77244: variable 'ansible_module_compression' from source: unknown 11728 1726882201.77246: variable 'ansible_shell_type' from source: unknown 11728 1726882201.77248: variable 'ansible_shell_executable' from source: unknown 11728 1726882201.77250: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882201.77252: variable 'ansible_pipelining' from source: unknown 11728 1726882201.77254: variable 'ansible_timeout' from source: unknown 11728 1726882201.77256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882201.77398: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882201.77400: variable 'omit' from source: magic vars 11728 1726882201.77402: starting attempt loop 11728 1726882201.77404: running the handler 11728 1726882201.77406: _low_level_execute_command(): starting 11728 1726882201.77412: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882201.78651: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882201.78665: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882201.78709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882201.78881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882201.79017: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882201.79072: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882201.80760: stdout chunk (state=3): >>>/root <<< 11728 1726882201.80900: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882201.80923: stdout chunk (state=3): >>><<< 11728 1726882201.80927: stderr chunk (state=3): >>><<< 11728 1726882201.81001: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882201.81005: _low_level_execute_command(): starting 11728 1726882201.81007: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882201.8094811-12864-268324658247286 `" && echo ansible-tmp-1726882201.8094811-12864-268324658247286="` echo /root/.ansible/tmp/ansible-tmp-1726882201.8094811-12864-268324658247286 `" ) && sleep 0' 11728 1726882201.81588: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882201.81610: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882201.81713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882201.81735: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882201.81751: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882201.81775: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882201.81941: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882201.83901: stdout chunk (state=3): >>>ansible-tmp-1726882201.8094811-12864-268324658247286=/root/.ansible/tmp/ansible-tmp-1726882201.8094811-12864-268324658247286 <<< 11728 1726882201.84098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882201.84101: stdout chunk (state=3): >>><<< 11728 1726882201.84104: stderr chunk (state=3): >>><<< 11728 1726882201.84302: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882201.8094811-12864-268324658247286=/root/.ansible/tmp/ansible-tmp-1726882201.8094811-12864-268324658247286 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882201.84305: variable 'ansible_module_compression' from source: unknown 11728 1726882201.84308: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882201.84309: variable 'ansible_facts' from source: unknown 11728 1726882201.84311: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882201.8094811-12864-268324658247286/AnsiballZ_command.py 11728 1726882201.84616: Sending initial data 11728 1726882201.84622: Sent initial data (156 bytes) 11728 1726882201.85174: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882201.85183: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882201.85195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882201.85214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882201.85227: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882201.85235: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882201.85249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882201.85263: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882201.85270: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882201.85282: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11728 1726882201.85358: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882201.85379: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882201.85383: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882201.85409: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882201.85489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882201.87145: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 
debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11728 1726882201.87216: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882201.87225: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882201.87313: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpycczwean /root/.ansible/tmp/ansible-tmp-1726882201.8094811-12864-268324658247286/AnsiballZ_command.py <<< 11728 1726882201.87317: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882201.8094811-12864-268324658247286/AnsiballZ_command.py" <<< 11728 1726882201.87368: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpycczwean" to remote "/root/.ansible/tmp/ansible-tmp-1726882201.8094811-12864-268324658247286/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882201.8094811-12864-268324658247286/AnsiballZ_command.py" <<< 11728 1726882201.88152: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882201.88257: stderr chunk (state=3): >>><<< 11728 1726882201.88260: stdout chunk (state=3): >>><<< 11728 1726882201.88262: done transferring module to remote 11728 1726882201.88275: _low_level_execute_command(): starting 11728 1726882201.88285: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882201.8094811-12864-268324658247286/ /root/.ansible/tmp/ansible-tmp-1726882201.8094811-12864-268324658247286/AnsiballZ_command.py && sleep 0' 11728 1726882201.88931: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882201.89016: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882201.89050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882201.89065: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882201.89087: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882201.89174: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882201.91013: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882201.91029: stderr chunk (state=3): >>><<< 11728 1726882201.91040: stdout chunk (state=3): >>><<< 11728 1726882201.91156: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882201.91160: _low_level_execute_command(): starting 11728 1726882201.91163: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882201.8094811-12864-268324658247286/AnsiballZ_command.py && sleep 0' 11728 1726882201.91731: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882201.91746: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882201.91762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882201.91779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882201.91816: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882201.91905: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882201.91933: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882201.91952: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882201.92030: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882202.07665: stdout chunk (state=3): >>> {"changed": true, "stdout": "0", "stderr": "", "rc": 0, "cmd": ["cat", 
"/sys/class/net/nm-bond/bonding/downdelay"], "start": "2024-09-20 21:30:02.071640", "end": "2024-09-20 21:30:02.074958", "delta": "0:00:00.003318", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/downdelay", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882202.09310: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882202.09314: stdout chunk (state=3): >>><<< 11728 1726882202.09330: stderr chunk (state=3): >>><<< 11728 1726882202.09416: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/downdelay"], "start": "2024-09-20 21:30:02.071640", "end": "2024-09-20 21:30:02.074958", "delta": "0:00:00.003318", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/downdelay", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
11728 1726882202.09420: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/downdelay', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882201.8094811-12864-268324658247286/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882202.09423: _low_level_execute_command(): starting 11728 1726882202.09425: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882201.8094811-12864-268324658247286/ > /dev/null 2>&1 && sleep 0' 11728 1726882202.10010: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882202.10028: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882202.10044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882202.10076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882202.10174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882202.10203: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882202.10222: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882202.10308: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882202.12168: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882202.12179: stdout chunk (state=3): >>><<< 11728 1726882202.12188: stderr chunk (state=3): >>><<< 11728 1726882202.12210: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882202.12219: handler run complete 11728 1726882202.12244: Evaluated conditional (False): False 11728 1726882202.12411: variable 'bond_opt' from source: unknown 11728 1726882202.12598: variable 'result' from source: unknown 11728 1726882202.12602: Evaluated conditional (bond_opt.value in result.stdout): True 11728 1726882202.12604: attempt loop complete, returning result 11728 1726882202.12607: variable 'bond_opt' from source: unknown 11728 1726882202.12609: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'downdelay', 'value': '0'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "downdelay", "value": "0" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/downdelay" ], "delta": "0:00:00.003318", "end": "2024-09-20 21:30:02.074958", "rc": 0, "start": "2024-09-20 21:30:02.071640" } STDOUT: 0 11728 1726882202.12827: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882202.12831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882202.12833: variable 'omit' from source: magic vars 11728 1726882202.12970: variable 'ansible_distribution_major_version' from source: facts 11728 1726882202.12981: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882202.12990: variable 'omit' from source: magic vars 11728 1726882202.13012: variable 'omit' from source: magic vars 11728 1726882202.13180: variable 'controller_device' from source: play vars 11728 1726882202.13190: variable 'bond_opt' from source: unknown 11728 1726882202.13214: variable 'omit' from source: magic vars 11728 1726882202.13263: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882202.13266: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882202.13268: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882202.13281: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882202.13290: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882202.13373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882202.13384: Set connection var ansible_connection to ssh 11728 1726882202.13400: Set connection var ansible_shell_executable to /bin/sh 11728 1726882202.13411: Set connection var ansible_timeout to 10 11728 1726882202.13418: Set connection var ansible_shell_type to sh 11728 1726882202.13429: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882202.13438: Set connection var ansible_pipelining to False 11728 1726882202.13463: variable 'ansible_shell_executable' from source: unknown 11728 1726882202.13471: variable 
'ansible_connection' from source: unknown 11728 1726882202.13486: variable 'ansible_module_compression' from source: unknown 11728 1726882202.13495: variable 'ansible_shell_type' from source: unknown 11728 1726882202.13503: variable 'ansible_shell_executable' from source: unknown 11728 1726882202.13512: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882202.13590: variable 'ansible_pipelining' from source: unknown 11728 1726882202.13592: variable 'ansible_timeout' from source: unknown 11728 1726882202.13597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882202.13828: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882202.13841: variable 'omit' from source: magic vars 11728 1726882202.13849: starting attempt loop 11728 1726882202.13855: running the handler 11728 1726882202.13865: _low_level_execute_command(): starting 11728 1726882202.13872: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882202.14850: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882202.14863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882202.14887: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882202.14894: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 11728 1726882202.14904: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882202.14910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882202.15037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882202.15091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882202.16726: stdout chunk (state=3): >>>/root <<< 11728 1726882202.17006: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882202.17013: stdout chunk (state=3): >>><<< 11728 1726882202.17015: stderr chunk (state=3): >>><<< 11728 1726882202.17017: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882202.17022: _low_level_execute_command(): starting 11728 1726882202.17024: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882202.1698282-12864-248676560786097 `" && echo ansible-tmp-1726882202.1698282-12864-248676560786097="` echo /root/.ansible/tmp/ansible-tmp-1726882202.1698282-12864-248676560786097 `" ) && sleep 0' 11728 1726882202.18043: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882202.18058: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882202.18078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882202.18096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882202.18118: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882202.18130: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882202.18227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882202.18247: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882202.18349: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882202.20258: stdout chunk (state=3): >>>ansible-tmp-1726882202.1698282-12864-248676560786097=/root/.ansible/tmp/ansible-tmp-1726882202.1698282-12864-248676560786097 <<< 11728 1726882202.20418: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882202.20423: stdout chunk (state=3): >>><<< 11728 1726882202.20425: stderr chunk (state=3): >>><<< 11728 1726882202.20440: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882202.1698282-12864-248676560786097=/root/.ansible/tmp/ansible-tmp-1726882202.1698282-12864-248676560786097 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882202.20606: variable 'ansible_module_compression' from source: unknown 11728 1726882202.20609: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882202.20611: variable 'ansible_facts' from source: unknown 11728 1726882202.20746: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882202.1698282-12864-248676560786097/AnsiballZ_command.py 11728 1726882202.21116: Sending initial data 11728 1726882202.21125: Sent initial data (156 bytes) 11728 1726882202.22179: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882202.22385: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882202.22415: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882202.22441: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882202.22485: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882202.22526: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882202.24099: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports 
extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 11728 1726882202.24238: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882202.24282: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882202.24356: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpqek9f9jt /root/.ansible/tmp/ansible-tmp-1726882202.1698282-12864-248676560786097/AnsiballZ_command.py <<< 11728 1726882202.24368: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882202.1698282-12864-248676560786097/AnsiballZ_command.py" <<< 11728 1726882202.24404: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpqek9f9jt" to remote "/root/.ansible/tmp/ansible-tmp-1726882202.1698282-12864-248676560786097/AnsiballZ_command.py" <<< 11728 1726882202.24416: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882202.1698282-12864-248676560786097/AnsiballZ_command.py" <<< 11728 1726882202.25725: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882202.25736: stderr chunk (state=3): >>><<< 11728 1726882202.25744: stdout chunk (state=3): >>><<< 11728 1726882202.25943: done transferring module to remote 11728 1726882202.25946: _low_level_execute_command(): starting 11728 1726882202.25948: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882202.1698282-12864-248676560786097/ /root/.ansible/tmp/ansible-tmp-1726882202.1698282-12864-248676560786097/AnsiballZ_command.py && sleep 0' 11728 1726882202.26815: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882202.26819: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882202.26858: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882202.26870: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882202.26885: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882202.27099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882202.28758: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
11728 1726882202.28882: stderr chunk (state=3): >>><<< 11728 1726882202.28885: stdout chunk (state=3): >>><<< 11728 1726882202.28904: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882202.28907: _low_level_execute_command(): starting 11728 1726882202.28913: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882202.1698282-12864-248676560786097/AnsiballZ_command.py && sleep 0' 11728 1726882202.29986: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882202.30021: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882202.30078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882202.30081: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882202.30083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11728 1726882202.30085: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882202.30087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882202.30164: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882202.30167: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882202.30208: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882202.30256: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882202.45665: stdout chunk (state=3): >>> {"changed": true, "stdout": "slow 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/lacp_rate"], "start": "2024-09-20 
21:30:02.452002", "end": "2024-09-20 21:30:02.455041", "delta": "0:00:00.003039", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/lacp_rate", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882202.47237: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882202.47241: stdout chunk (state=3): >>><<< 11728 1726882202.47299: stderr chunk (state=3): >>><<< 11728 1726882202.47303: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "slow 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/lacp_rate"], "start": "2024-09-20 21:30:02.452002", "end": "2024-09-20 21:30:02.455041", "delta": "0:00:00.003039", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/lacp_rate", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
11728 1726882202.47306: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/lacp_rate', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882202.1698282-12864-248676560786097/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882202.47312: _low_level_execute_command(): starting 11728 1726882202.47315: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882202.1698282-12864-248676560786097/ > /dev/null 2>&1 && sleep 0' 11728 1726882202.48015: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882202.48075: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882202.48087: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882202.48119: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882202.48197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882202.50203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882202.50207: stdout chunk (state=3): >>><<< 11728 1726882202.50210: stderr chunk (state=3): >>><<< 11728 1726882202.50212: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882202.50215: handler run complete 11728 1726882202.50217: Evaluated conditional (False): False 11728 1726882202.50269: variable 'bond_opt' from source: unknown 11728 1726882202.50274: variable 'result' from source: unknown 11728 1726882202.50291: Evaluated conditional (bond_opt.value in result.stdout): True 11728 1726882202.50305: attempt loop complete, returning result 11728 1726882202.50322: variable 'bond_opt' from source: unknown 11728 1726882202.50385: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'lacp_rate', 'value': 'slow'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "lacp_rate", "value": "slow" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/lacp_rate" ], "delta": "0:00:00.003039", "end": "2024-09-20 21:30:02.455041", "rc": 0, "start": "2024-09-20 21:30:02.452002" } STDOUT: slow 0 11728 1726882202.50548: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882202.50551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882202.50554: variable 'omit' from source: magic vars 11728 1726882202.50704: variable 'ansible_distribution_major_version' from source: facts 11728 1726882202.50707: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882202.50710: variable 'omit' from source: magic vars 11728 1726882202.50712: variable 'omit' from source: magic vars 11728 1726882202.50834: variable 'controller_device' from source: play vars 11728 1726882202.50837: variable 'bond_opt' from source: unknown 11728 1726882202.50853: variable 'omit' from source: magic vars 11728 1726882202.50872: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882202.50940: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882202.50943: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882202.50945: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882202.50948: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882202.50950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882202.50975: Set connection var ansible_connection to ssh 11728 1726882202.50983: Set connection var ansible_shell_executable to /bin/sh 11728 1726882202.50988: Set connection var ansible_timeout to 10 11728 1726882202.50990: Set connection var ansible_shell_type to sh 11728 1726882202.51105: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882202.51108: Set connection var ansible_pipelining to False 11728 1726882202.51110: variable 'ansible_shell_executable' from source: unknown 11728 1726882202.51112: variable 'ansible_connection' from source: unknown 11728 1726882202.51115: variable 'ansible_module_compression' from source: unknown 11728 1726882202.51117: 
variable 'ansible_shell_type' from source: unknown 11728 1726882202.51119: variable 'ansible_shell_executable' from source: unknown 11728 1726882202.51121: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882202.51123: variable 'ansible_pipelining' from source: unknown 11728 1726882202.51125: variable 'ansible_timeout' from source: unknown 11728 1726882202.51127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882202.51136: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882202.51247: variable 'omit' from source: magic vars 11728 1726882202.51249: starting attempt loop 11728 1726882202.51251: running the handler 11728 1726882202.51253: _low_level_execute_command(): starting 11728 1726882202.51255: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882202.51731: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882202.51742: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882202.51752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882202.51839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882202.51842: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882202.51844: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882202.51850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882202.51853: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882202.51899: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882202.51910: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882202.51982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882202.53580: stdout chunk (state=3): >>>/root <<< 11728 1726882202.53799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882202.53802: stdout chunk (state=3): >>><<< 11728 1726882202.53804: stderr chunk (state=3): >>><<< 11728 1726882202.53807: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882202.53809: _low_level_execute_command(): starting 11728 1726882202.53811: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882202.5375135-12864-246229750974171 `" && echo ansible-tmp-1726882202.5375135-12864-246229750974171="` echo /root/.ansible/tmp/ansible-tmp-1726882202.5375135-12864-246229750974171 `" ) && sleep 0' 11728 1726882202.54419: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882202.54429: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882202.54439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882202.54460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882202.54472: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882202.54478: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882202.54487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882202.54504: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882202.54511: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882202.54517: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11728 1726882202.54525: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882202.54673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882202.54681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882202.54683: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882202.54685: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882202.54687: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882202.54718: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882202.56585: stdout chunk (state=3): >>>ansible-tmp-1726882202.5375135-12864-246229750974171=/root/.ansible/tmp/ansible-tmp-1726882202.5375135-12864-246229750974171 <<< 11728 1726882202.56759: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882202.56763: stdout chunk (state=3): >>><<< 11728 1726882202.56765: stderr chunk (state=3): >>><<< 11728 1726882202.56999: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882202.5375135-12864-246229750974171=/root/.ansible/tmp/ansible-tmp-1726882202.5375135-12864-246229750974171 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882202.57003: variable 'ansible_module_compression' from source: unknown 11728 1726882202.57006: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882202.57008: variable 'ansible_facts' from source: unknown 11728 1726882202.57010: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882202.5375135-12864-246229750974171/AnsiballZ_command.py 11728 1726882202.57152: Sending initial data 11728 1726882202.57155: Sent initial data (156 bytes) 11728 1726882202.57700: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882202.57800: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882202.57807: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882202.57880: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882202.59445: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports 
extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882202.59474: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882202.59533: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpslqyxgqs /root/.ansible/tmp/ansible-tmp-1726882202.5375135-12864-246229750974171/AnsiballZ_command.py <<< 11728 1726882202.59536: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882202.5375135-12864-246229750974171/AnsiballZ_command.py" <<< 11728 1726882202.60001: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpslqyxgqs" to remote "/root/.ansible/tmp/ansible-tmp-1726882202.5375135-12864-246229750974171/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882202.5375135-12864-246229750974171/AnsiballZ_command.py" <<< 11728 1726882202.61280: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882202.61315: stderr chunk (state=3): >>><<< 11728 1726882202.61434: stdout chunk (state=3): >>><<< 11728 1726882202.61437: done transferring module to remote 11728 1726882202.61440: _low_level_execute_command(): starting 11728 1726882202.61442: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882202.5375135-12864-246229750974171/ /root/.ansible/tmp/ansible-tmp-1726882202.5375135-12864-246229750974171/AnsiballZ_command.py && sleep 0' 11728 1726882202.62096: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882202.62129: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882202.62205: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882202.64358: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882202.64362: stdout chunk (state=3): >>><<< 11728 1726882202.64364: stderr chunk (state=3): >>><<< 11728 1726882202.64367: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882202.64369: _low_level_execute_command(): starting 11728 1726882202.64371: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882202.5375135-12864-246229750974171/AnsiballZ_command.py && sleep 0' 11728 1726882202.65556: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882202.65702: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882202.65865: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882202.66381: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882202.81810: stdout chunk (state=3): >>> {"changed": true, "stdout": "128", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/lp_interval"], "start": "2024-09-20 21:30:02.813197", "end": "2024-09-20 21:30:02.816405", "delta": "0:00:00.003208", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/lp_interval", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": 
null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882202.83367: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882202.83371: stdout chunk (state=3): >>><<< 11728 1726882202.83378: stderr chunk (state=3): >>><<< 11728 1726882202.83397: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "128", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/lp_interval"], "start": "2024-09-20 21:30:02.813197", "end": "2024-09-20 21:30:02.816405", "delta": "0:00:00.003208", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/lp_interval", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
11728 1726882202.83427: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/lp_interval', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882202.5375135-12864-246229750974171/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882202.83430: _low_level_execute_command(): starting 11728 1726882202.83437: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882202.5375135-12864-246229750974171/ > /dev/null 2>&1 && sleep 0' 11728 1726882202.84012: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882202.84034: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882202.84037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882202.84199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882202.84203: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882202.84205: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882202.84207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882202.84210: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882202.84212: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882202.84214: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11728 1726882202.84215: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882202.84217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882202.84219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882202.84221: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882202.84223: stderr chunk (state=3): >>>debug2: match found <<< 11728 1726882202.84225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882202.84227: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882202.84229: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882202.84253: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882202.84319: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882202.86153: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882202.86185: stderr chunk (state=3): >>><<< 11728 1726882202.86188: stdout chunk (state=3): >>><<< 11728 1726882202.86209: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882202.86213: handler run complete 11728 1726882202.86239: Evaluated conditional (False): False 11728 1726882202.86400: variable 'bond_opt' from source: unknown 11728 1726882202.86403: variable 'result' from source: unknown 11728 1726882202.86438: Evaluated conditional (bond_opt.value in result.stdout): True 11728 1726882202.86441: attempt loop complete, returning result 11728 1726882202.86448: variable 'bond_opt' from source: unknown 11728 1726882202.86518: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'lp_interval', 'value': '128'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "lp_interval", "value": "128" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/lp_interval" ], "delta": "0:00:00.003208", "end": "2024-09-20 21:30:02.816405", "rc": 0, "start": "2024-09-20 21:30:02.813197" } STDOUT: 128 11728 1726882202.86801: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882202.86805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882202.86808: variable 'omit' from source: magic vars 11728 1726882202.86810: variable 'ansible_distribution_major_version' from source: facts 11728 1726882202.86812: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882202.86814: variable 'omit' from source: magic vars 11728 1726882202.86816: variable 'omit' from source: magic vars 11728 1726882202.87039: variable 'controller_device' from source: play vars 11728 1726882202.87043: variable 'bond_opt' from source: unknown 11728 1726882202.87045: variable 'omit' from source: magic vars 11728 1726882202.87048: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882202.87050: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882202.87052: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882202.87054: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882202.87056: variable 'ansible_host' from source: host 
vars for 'managed_node3' 11728 1726882202.87058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882202.87129: Set connection var ansible_connection to ssh 11728 1726882202.87138: Set connection var ansible_shell_executable to /bin/sh 11728 1726882202.87144: Set connection var ansible_timeout to 10 11728 1726882202.87146: Set connection var ansible_shell_type to sh 11728 1726882202.87153: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882202.87158: Set connection var ansible_pipelining to False 11728 1726882202.87179: variable 'ansible_shell_executable' from source: unknown 11728 1726882202.87182: variable 'ansible_connection' from source: unknown 11728 1726882202.87184: variable 'ansible_module_compression' from source: unknown 11728 1726882202.87186: variable 'ansible_shell_type' from source: unknown 11728 1726882202.87189: variable 'ansible_shell_executable' from source: unknown 11728 1726882202.87196: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882202.87199: variable 'ansible_pipelining' from source: unknown 11728 1726882202.87201: variable 'ansible_timeout' from source: unknown 11728 1726882202.87235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882202.87302: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882202.87327: variable 'omit' from source: magic vars 11728 1726882202.87330: starting attempt loop 11728 1726882202.87332: running the handler 11728 1726882202.87334: _low_level_execute_command(): starting 11728 1726882202.87336: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882202.87954: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882202.87979: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882202.88074: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882202.88098: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882202.88174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882202.89828: stdout chunk (state=3): >>>/root <<< 11728 1726882202.89965: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882202.89969: stdout chunk (state=3): 
>>><<< 11728 1726882202.89971: stderr chunk (state=3): >>><<< 11728 1726882202.89986: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882202.90073: _low_level_execute_command(): starting 11728 1726882202.90077: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882202.8999114-12864-43643536275321 `" && echo ansible-tmp-1726882202.8999114-12864-43643536275321="` echo /root/.ansible/tmp/ansible-tmp-1726882202.8999114-12864-43643536275321 `" ) && sleep 0' 11728 1726882202.90615: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882202.90628: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882202.90709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882202.90737: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882202.90751: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882202.90768: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882202.90915: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882202.92822: stdout chunk (state=3): >>>ansible-tmp-1726882202.8999114-12864-43643536275321=/root/.ansible/tmp/ansible-tmp-1726882202.8999114-12864-43643536275321 <<< 11728 1726882202.92972: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 
1726882202.92975: stdout chunk (state=3): >>><<< 11728 1726882202.92978: stderr chunk (state=3): >>><<< 11728 1726882202.93100: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882202.8999114-12864-43643536275321=/root/.ansible/tmp/ansible-tmp-1726882202.8999114-12864-43643536275321 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882202.93104: variable 'ansible_module_compression' from source: unknown 11728 1726882202.93106: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882202.93108: variable 'ansible_facts' from source: unknown 11728 1726882202.93157: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882202.8999114-12864-43643536275321/AnsiballZ_command.py 11728 1726882202.93348: Sending initial data 11728 1726882202.93351: Sent initial data (155 bytes) 11728 1726882202.93887: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882202.93907: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882202.93924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882202.93991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882202.94044: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882202.94061: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882202.94086: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882202.94171: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 11728 1726882202.95707: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11728 1726882202.95731: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882202.95785: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882202.95841: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpan8gunuf /root/.ansible/tmp/ansible-tmp-1726882202.8999114-12864-43643536275321/AnsiballZ_command.py <<< 11728 1726882202.95845: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882202.8999114-12864-43643536275321/AnsiballZ_command.py" <<< 11728 1726882202.95886: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpan8gunuf" to remote "/root/.ansible/tmp/ansible-tmp-1726882202.8999114-12864-43643536275321/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882202.8999114-12864-43643536275321/AnsiballZ_command.py" <<< 11728 1726882202.96900: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882202.96903: stderr chunk (state=3): >>><<< 11728 1726882202.96905: stdout chunk (state=3): >>><<< 11728 1726882202.96907: done transferring module to remote 11728 1726882202.96909: _low_level_execute_command(): starting 11728 1726882202.96911: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882202.8999114-12864-43643536275321/ /root/.ansible/tmp/ansible-tmp-1726882202.8999114-12864-43643536275321/AnsiballZ_command.py && sleep 0' 11728 1726882202.97438: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882202.97451: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882202.97513: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882202.97554: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882202.97566: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882202.97583: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882202.97663: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882202.99601: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882202.99604: stderr chunk (state=3): >>><<< 11728 1726882202.99607: stdout chunk (state=3): >>><<< 11728 1726882202.99609: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882202.99612: _low_level_execute_command(): starting 11728 1726882202.99614: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882202.8999114-12864-43643536275321/AnsiballZ_command.py && sleep 0' 11728 1726882203.00038: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882203.00048: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882203.00058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882203.00078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882203.00090: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882203.00100: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882203.00111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882203.00125: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882203.00133: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882203.00139: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11728 1726882203.00147: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882203.00157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882203.00186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882203.00250: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882203.00277: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882203.00337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882203.15525: stdout chunk (state=3): >>> {"changed": true, "stdout": "110", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/miimon"], "start": "2024-09-20 21:30:03.150014", "end": "2024-09-20 21:30:03.153227", "delta": "0:00:00.003213", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/miimon", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882203.17290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882203.17299: stdout chunk (state=3): >>><<< 11728 1726882203.17302: stderr chunk (state=3): >>><<< 11728 1726882203.17304: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "110", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/miimon"], "start": "2024-09-20 21:30:03.150014", "end": "2024-09-20 21:30:03.153227", "delta": "0:00:00.003213", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/miimon", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
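The per-option checks in this trace (lp_interval above, miimon here, and further options below) all follow the same pattern: read the option from /sys/class/net/nm-bond/bonding/<key> on the target and confirm that the expected value appears in the command output. A task of roughly the following shape would reproduce that pattern; it is a minimal sketch, not the playbook itself. The list name bond_options_to_assert is an assumption, while controller_device, the bond_opt loop variable, and the condition bond_opt.value in result.stdout are taken directly from the debug output.

# Hypothetical sketch of the verification loop implied by this log.
# 'bond_options_to_assert' is an assumed variable name holding the
# key/value pairs seen as loop items (lp_interval=128, miimon=110, ...).
- name: Verify bond option value via sysfs
  command: cat /sys/class/net/{{ controller_device }}/bonding/{{ bond_opt.key }}
  register: result
  changed_when: false                     # plausibly behind "Evaluated conditional (False): False"
  until: bond_opt.value in result.stdout  # the conditional evaluated for each item
  loop: "{{ bond_options_to_assert }}"
  loop_control:
    loop_var: bond_opt

An until/retries combination is one way to produce the "attempt loop" and "attempts": 1 fields seen in the per-item results; a failed_when or assert on the same condition would leave a very similar trace.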
11728 1726882203.17307: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/miimon', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882202.8999114-12864-43643536275321/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882203.17309: _low_level_execute_command(): starting 11728 1726882203.17311: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882202.8999114-12864-43643536275321/ > /dev/null 2>&1 && sleep 0' 11728 1726882203.17880: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882203.17904: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882203.17920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882203.17937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882203.18043: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882203.18330: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882203.18397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882203.20241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882203.20249: stdout chunk (state=3): >>><<< 11728 1726882203.20257: stderr chunk (state=3): >>><<< 11728 1726882203.20274: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882203.20282: handler run complete 11728 1726882203.20307: Evaluated conditional (False): False 11728 1726882203.20446: variable 'bond_opt' from source: unknown 11728 1726882203.20460: variable 'result' from source: unknown 11728 1726882203.20475: Evaluated conditional (bond_opt.value in result.stdout): True 11728 1726882203.20598: attempt loop complete, returning result 11728 1726882203.20601: variable 'bond_opt' from source: unknown 11728 1726882203.20603: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'miimon', 'value': '110'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "miimon", "value": "110" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/miimon" ], "delta": "0:00:00.003213", "end": "2024-09-20 21:30:03.153227", "rc": 0, "start": "2024-09-20 21:30:03.150014" } STDOUT: 110 11728 1726882203.20770: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882203.20821: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882203.20824: variable 'omit' from source: magic vars 11728 1726882203.20962: variable 'ansible_distribution_major_version' from source: facts 11728 1726882203.20973: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882203.20981: variable 'omit' from source: magic vars 11728 1726882203.21003: variable 'omit' from source: magic vars 11728 1726882203.21181: variable 'controller_device' from source: play vars 11728 1726882203.21191: variable 'bond_opt' from source: unknown 11728 1726882203.21256: variable 'omit' from source: magic vars 11728 1726882203.21259: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882203.21262: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882203.21264: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882203.21278: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882203.21288: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882203.21300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882203.21376: Set connection var ansible_connection to ssh 11728 1726882203.21390: Set connection var ansible_shell_executable to /bin/sh 11728 1726882203.21474: Set connection var ansible_timeout to 10 11728 1726882203.21477: Set connection var ansible_shell_type to sh 11728 1726882203.21479: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882203.21481: Set connection var ansible_pipelining to False 11728 1726882203.21483: variable 'ansible_shell_executable' from source: unknown 11728 1726882203.21485: variable 'ansible_connection' from source: unknown 11728 1726882203.21487: 
variable 'ansible_module_compression' from source: unknown 11728 1726882203.21489: variable 'ansible_shell_type' from source: unknown 11728 1726882203.21491: variable 'ansible_shell_executable' from source: unknown 11728 1726882203.21492: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882203.21496: variable 'ansible_pipelining' from source: unknown 11728 1726882203.21499: variable 'ansible_timeout' from source: unknown 11728 1726882203.21501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882203.21572: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882203.21591: variable 'omit' from source: magic vars 11728 1726882203.21602: starting attempt loop 11728 1726882203.21609: running the handler 11728 1726882203.21621: _low_level_execute_command(): starting 11728 1726882203.21630: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882203.22355: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882203.22370: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882203.22386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882203.22406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882203.22462: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882203.22520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882203.22586: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882203.22648: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882203.24197: stdout chunk (state=3): >>>/root <<< 11728 1726882203.24336: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882203.24346: stdout chunk (state=3): >>><<< 11728 1726882203.24362: stderr chunk (state=3): >>><<< 11728 1726882203.24398: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882203.24402: _low_level_execute_command(): starting 11728 1726882203.24404: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882203.2437847-12864-21785208264499 `" && echo ansible-tmp-1726882203.2437847-12864-21785208264499="` echo /root/.ansible/tmp/ansible-tmp-1726882203.2437847-12864-21785208264499 `" ) && sleep 0' 11728 1726882203.25113: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882203.25130: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882203.25151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882203.25169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882203.25188: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882203.25206: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882203.25281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882203.25324: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882203.25342: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882203.25372: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882203.25524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882203.27384: stdout chunk (state=3): >>>ansible-tmp-1726882203.2437847-12864-21785208264499=/root/.ansible/tmp/ansible-tmp-1726882203.2437847-12864-21785208264499 <<< 11728 1726882203.27516: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882203.27559: stderr chunk (state=3): >>><<< 11728 1726882203.27562: stdout chunk (state=3): >>><<< 11728 1726882203.27772: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882203.2437847-12864-21785208264499=/root/.ansible/tmp/ansible-tmp-1726882203.2437847-12864-21785208264499 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882203.27776: variable 'ansible_module_compression' from source: unknown 11728 1726882203.27778: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882203.27780: variable 'ansible_facts' from source: unknown 11728 1726882203.27782: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882203.2437847-12864-21785208264499/AnsiballZ_command.py 11728 1726882203.28019: Sending initial data 11728 1726882203.28120: Sent initial data (155 bytes) 11728 1726882203.28565: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882203.28607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882203.28679: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882203.28708: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882203.28725: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882203.28821: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882203.30318: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11728 1726882203.30326: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 11728 1726882203.30333: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 11728 1726882203.30340: stderr chunk (state=3): >>>debug2: Server supports extension 
"fstatvfs@openssh.com" revision 2 <<< 11728 1726882203.30348: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 11728 1726882203.30359: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882203.30433: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882203.30484: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpg2lmlp4w /root/.ansible/tmp/ansible-tmp-1726882203.2437847-12864-21785208264499/AnsiballZ_command.py <<< 11728 1726882203.30488: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882203.2437847-12864-21785208264499/AnsiballZ_command.py" <<< 11728 1726882203.30525: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpg2lmlp4w" to remote "/root/.ansible/tmp/ansible-tmp-1726882203.2437847-12864-21785208264499/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882203.2437847-12864-21785208264499/AnsiballZ_command.py" <<< 11728 1726882203.31348: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882203.31351: stdout chunk (state=3): >>><<< 11728 1726882203.31353: stderr chunk (state=3): >>><<< 11728 1726882203.31360: done transferring module to remote 11728 1726882203.31370: _low_level_execute_command(): starting 11728 1726882203.31377: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882203.2437847-12864-21785208264499/ /root/.ansible/tmp/ansible-tmp-1726882203.2437847-12864-21785208264499/AnsiballZ_command.py && sleep 0' 11728 1726882203.31965: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882203.31978: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882203.32012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882203.32113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882203.32136: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882203.32208: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882203.33952: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882203.33982: stdout chunk (state=3): >>><<< 11728 1726882203.33985: stderr chunk (state=3): >>><<< 11728 1726882203.34004: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882203.34088: _low_level_execute_command(): starting 11728 1726882203.34092: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882203.2437847-12864-21785208264499/AnsiballZ_command.py && sleep 0' 11728 1726882203.34654: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882203.34670: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882203.34687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882203.34767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882203.34810: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882203.34830: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882203.34852: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882203.34933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882203.50419: stdout chunk (state=3): >>> {"changed": true, "stdout": "64", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/num_grat_arp"], 
"start": "2024-09-20 21:30:03.498996", "end": "2024-09-20 21:30:03.502031", "delta": "0:00:00.003035", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/num_grat_arp", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882203.51931: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882203.51935: stdout chunk (state=3): >>><<< 11728 1726882203.51937: stderr chunk (state=3): >>><<< 11728 1726882203.51940: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "64", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/num_grat_arp"], "start": "2024-09-20 21:30:03.498996", "end": "2024-09-20 21:30:03.502031", "delta": "0:00:00.003035", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/num_grat_arp", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
11728 1726882203.51997: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/num_grat_arp', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882203.2437847-12864-21785208264499/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882203.52001: _low_level_execute_command(): starting 11728 1726882203.52003: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882203.2437847-12864-21785208264499/ > /dev/null 2>&1 && sleep 0' 11728 1726882203.53064: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882203.53072: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882203.53113: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882203.53123: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882203.55055: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882203.55058: stdout chunk (state=3): >>><<< 11728 1726882203.55065: stderr chunk (state=3): >>><<< 11728 1726882203.55080: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882203.55083: handler run complete 11728 1726882203.55216: Evaluated conditional (False): False 11728 1726882203.55473: variable 'bond_opt' from source: unknown 11728 1726882203.55479: variable 'result' from source: unknown 11728 1726882203.55496: Evaluated conditional (bond_opt.value in result.stdout): True 11728 1726882203.55510: attempt loop complete, returning result 11728 1726882203.55610: variable 'bond_opt' from source: unknown 11728 1726882203.55929: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'num_grat_arp', 'value': '64'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "num_grat_arp", "value": "64" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/num_grat_arp" ], "delta": "0:00:00.003035", "end": "2024-09-20 21:30:03.502031", "rc": 0, "start": "2024-09-20 21:30:03.498996" } STDOUT: 64 11728 1726882203.56458: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882203.56461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882203.56463: variable 'omit' from source: magic vars 11728 1726882203.56540: variable 'ansible_distribution_major_version' from source: facts 11728 1726882203.56546: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882203.56550: variable 'omit' from source: magic vars 11728 1726882203.56564: variable 'omit' from source: magic vars 11728 1726882203.56980: variable 'controller_device' from source: play vars 11728 1726882203.56983: variable 'bond_opt' from source: unknown 11728 1726882203.57003: variable 'omit' from source: magic vars 11728 1726882203.57024: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882203.57032: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882203.57038: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882203.57107: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882203.57110: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882203.57112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882203.57300: Set connection var ansible_connection to ssh 11728 1726882203.57319: Set connection var ansible_shell_executable to /bin/sh 11728 1726882203.57322: Set connection var ansible_timeout to 10 11728 1726882203.57324: Set connection var ansible_shell_type to sh 11728 1726882203.57326: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882203.57328: Set connection var ansible_pipelining to False 11728 1726882203.57344: variable 'ansible_shell_executable' from source: unknown 11728 1726882203.57347: variable 'ansible_connection' from source: unknown 11728 1726882203.57349: variable 'ansible_module_compression' from source: unknown 11728 1726882203.57352: variable 'ansible_shell_type' from source: unknown 11728 
1726882203.57354: variable 'ansible_shell_executable' from source: unknown 11728 1726882203.57356: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882203.57538: variable 'ansible_pipelining' from source: unknown 11728 1726882203.57541: variable 'ansible_timeout' from source: unknown 11728 1726882203.57543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882203.57769: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882203.57772: variable 'omit' from source: magic vars 11728 1726882203.57774: starting attempt loop 11728 1726882203.57777: running the handler 11728 1726882203.57778: _low_level_execute_command(): starting 11728 1726882203.57781: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882203.59103: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882203.59120: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882203.59210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882203.59406: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882203.59424: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882203.59504: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882203.61060: stdout chunk (state=3): >>>/root <<< 11728 1726882203.61195: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882203.61205: stdout chunk (state=3): >>><<< 11728 1726882203.61238: stderr chunk (state=3): >>><<< 11728 1726882203.61257: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882203.61306: _low_level_execute_command(): starting 11728 1726882203.61315: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882203.6129606-12864-202418770635863 `" && echo ansible-tmp-1726882203.6129606-12864-202418770635863="` echo /root/.ansible/tmp/ansible-tmp-1726882203.6129606-12864-202418770635863 `" ) && sleep 0' 11728 1726882203.62527: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882203.62530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882203.62532: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882203.62539: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882203.62541: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882203.62544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882203.62613: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882203.62818: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882203.62883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882203.64752: stdout chunk (state=3): >>>ansible-tmp-1726882203.6129606-12864-202418770635863=/root/.ansible/tmp/ansible-tmp-1726882203.6129606-12864-202418770635863 <<< 11728 1726882203.64916: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882203.64977: stderr chunk (state=3): >>><<< 11728 1726882203.65005: stdout chunk (state=3): >>><<< 11728 1726882203.65081: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882203.6129606-12864-202418770635863=/root/.ansible/tmp/ansible-tmp-1726882203.6129606-12864-202418770635863 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882203.65110: variable 'ansible_module_compression' from source: unknown 11728 1726882203.65212: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882203.65238: variable 'ansible_facts' from source: unknown 11728 1726882203.65415: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882203.6129606-12864-202418770635863/AnsiballZ_command.py 11728 1726882203.66001: Sending initial data 11728 1726882203.66005: Sent initial data (156 bytes) 11728 1726882203.66826: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882203.66841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882203.66950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882203.66954: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882203.66973: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 11728 1726882203.66979: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882203.67056: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 11728 1726882203.67059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882203.67309: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882203.68737: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11728 1726882203.68745: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 11728 1726882203.68752: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 11728 1726882203.68760: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 11728 1726882203.68766: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" 
revision 1 <<< 11728 1726882203.68773: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 11728 1726882203.68782: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 11728 1726882203.68785: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 <<< 11728 1726882203.68797: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11728 1726882203.68805: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 11728 1726882203.68814: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882203.68892: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882203.68925: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmp_ankljh3 /root/.ansible/tmp/ansible-tmp-1726882203.6129606-12864-202418770635863/AnsiballZ_command.py <<< 11728 1726882203.68935: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882203.6129606-12864-202418770635863/AnsiballZ_command.py" <<< 11728 1726882203.69000: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 11728 1726882203.69007: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmp_ankljh3" to remote "/root/.ansible/tmp/ansible-tmp-1726882203.6129606-12864-202418770635863/AnsiballZ_command.py" <<< 11728 1726882203.69150: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882203.6129606-12864-202418770635863/AnsiballZ_command.py" <<< 11728 1726882203.70636: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882203.70639: stderr chunk (state=3): >>><<< 11728 1726882203.70641: stdout chunk (state=3): >>><<< 11728 1726882203.70652: done transferring module to remote 11728 1726882203.70660: _low_level_execute_command(): starting 11728 1726882203.70665: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882203.6129606-12864-202418770635863/ /root/.ansible/tmp/ansible-tmp-1726882203.6129606-12864-202418770635863/AnsiballZ_command.py && sleep 0' 11728 1726882203.72299: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882203.72303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882203.72305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882203.72308: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 11728 1726882203.72310: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882203.72612: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882203.72647: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882203.74473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882203.74476: stdout chunk (state=3): >>><<< 11728 1726882203.74479: stderr chunk (state=3): >>><<< 11728 1726882203.74498: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882203.74516: _low_level_execute_command(): starting 11728 1726882203.74527: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882203.6129606-12864-202418770635863/AnsiballZ_command.py && sleep 0' 11728 1726882203.75974: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882203.75986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882203.76001: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882203.76400: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882203.76412: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882203.76743: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882203.76839: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882203.92121: stdout chunk (state=3): >>> {"changed": true, "stdout": "225", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/resend_igmp"], "start": "2024-09-20 21:30:03.916511", "end": "2024-09-20 21:30:03.919594", "delta": "0:00:00.003083", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/resend_igmp", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882203.93676: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882203.93681: stdout chunk (state=3): >>><<< 11728 1726882203.93683: stderr chunk (state=3): >>><<< 11728 1726882203.93710: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "225", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/resend_igmp"], "start": "2024-09-20 21:30:03.916511", "end": "2024-09-20 21:30:03.919594", "delta": "0:00:00.003083", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/resend_igmp", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
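
The module result above is the substance of this task item: the loop reads one bonding option straight from sysfs (here cat /sys/class/net/nm-bond/bonding/resend_igmp) and the conditional that the log evaluates a few entries later, bond_opt.value in result.stdout, decides whether the item passes. A minimal stand-alone sketch of the same check, assuming the bond device is named nm-bond and using the option values observed in this run:

    # verify_bond_opts.py -- stand-alone sketch of the check the task above performs.
    # Assumes a bond device named "nm-bond"; the expected values are the ones seen in this run.
    from pathlib import Path

    EXPECTED = {"resend_igmp": "225", "updelay": "0", "use_carrier": "1"}

    def read_bond_opt(device: str, option: str) -> str:
        # Mirrors `cat /sys/class/net/<device>/bonding/<option>`.
        return Path(f"/sys/class/net/{device}/bonding/{option}").read_text().strip()

    def main() -> None:
        for option, expected in EXPECTED.items():
            actual = read_bond_opt("nm-bond", option)
            # Same containment test the log reports as `bond_opt.value in result.stdout`.
            status = "ok" if expected in actual else "FAILED"
            print(f"{option}: expected {expected!r}, got {actual!r} -> {status}")

    if __name__ == "__main__":
        main()
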
11728 1726882203.93737: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/resend_igmp', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882203.6129606-12864-202418770635863/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882203.93741: _low_level_execute_command(): starting 11728 1726882203.93747: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882203.6129606-12864-202418770635863/ > /dev/null 2>&1 && sleep 0' 11728 1726882203.94349: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882203.94358: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882203.94369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882203.94383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882203.94401: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882203.94409: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882203.94499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882203.94503: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882203.94505: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882203.94507: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11728 1726882203.94509: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882203.94511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882203.94512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882203.94514: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882203.94516: stderr chunk (state=3): >>>debug2: match found <<< 11728 1726882203.94518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882203.94563: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882203.94617: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882203.94620: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882203.94672: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882203.96517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882203.96529: stderr chunk (state=3): >>><<< 11728 1726882203.96537: stdout chunk (state=3): >>><<< 11728 1726882203.96557: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882203.96570: handler run complete 11728 1726882203.96597: Evaluated conditional (False): False 11728 1726882203.96789: variable 'bond_opt' from source: unknown 11728 1726882203.96792: variable 'result' from source: unknown 11728 1726882203.96799: Evaluated conditional (bond_opt.value in result.stdout): True 11728 1726882203.96801: attempt loop complete, returning result 11728 1726882203.96820: variable 'bond_opt' from source: unknown 11728 1726882203.96887: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'resend_igmp', 'value': '225'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "resend_igmp", "value": "225" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/resend_igmp" ], "delta": "0:00:00.003083", "end": "2024-09-20 21:30:03.919594", "rc": 0, "start": "2024-09-20 21:30:03.916511" } STDOUT: 225 11728 1726882203.97230: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882203.97233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882203.97236: variable 'omit' from source: magic vars 11728 1726882203.97321: variable 'ansible_distribution_major_version' from source: facts 11728 1726882203.97332: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882203.97341: variable 'omit' from source: magic vars 11728 1726882203.97365: variable 'omit' from source: magic vars 11728 1726882203.97531: variable 'controller_device' from source: play vars 11728 1726882203.97541: variable 'bond_opt' from source: unknown 11728 1726882203.97601: variable 'omit' from source: magic vars 11728 1726882203.97604: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882203.97609: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882203.97622: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882203.97640: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882203.97648: variable 'ansible_host' from source: host 
vars for 'managed_node3' 11728 1726882203.97656: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882203.97787: Set connection var ansible_connection to ssh 11728 1726882203.97790: Set connection var ansible_shell_executable to /bin/sh 11728 1726882203.97794: Set connection var ansible_timeout to 10 11728 1726882203.97797: Set connection var ansible_shell_type to sh 11728 1726882203.97799: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882203.97801: Set connection var ansible_pipelining to False 11728 1726882203.97803: variable 'ansible_shell_executable' from source: unknown 11728 1726882203.97811: variable 'ansible_connection' from source: unknown 11728 1726882203.97818: variable 'ansible_module_compression' from source: unknown 11728 1726882203.97825: variable 'ansible_shell_type' from source: unknown 11728 1726882203.97832: variable 'ansible_shell_executable' from source: unknown 11728 1726882203.97843: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882203.97851: variable 'ansible_pipelining' from source: unknown 11728 1726882203.97858: variable 'ansible_timeout' from source: unknown 11728 1726882203.97866: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882203.98001: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882203.98004: variable 'omit' from source: magic vars 11728 1726882203.98006: starting attempt loop 11728 1726882203.98008: running the handler 11728 1726882203.98010: _low_level_execute_command(): starting 11728 1726882203.98012: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882203.98663: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882203.98727: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882203.98746: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882203.98784: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882203.98867: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882204.00485: stdout chunk (state=3): >>>/root <<< 11728 1726882204.00869: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882204.00890: stdout chunk (state=3): 
>>><<< 11728 1726882204.00895: stderr chunk (state=3): >>><<< 11728 1726882204.00911: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882204.01000: _low_level_execute_command(): starting 11728 1726882204.01003: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882204.0091617-12864-225298688725967 `" && echo ansible-tmp-1726882204.0091617-12864-225298688725967="` echo /root/.ansible/tmp/ansible-tmp-1726882204.0091617-12864-225298688725967 `" ) && sleep 0' 11728 1726882204.01592: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882204.01663: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882204.01697: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882204.01720: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882204.01834: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882204.03665: stdout chunk (state=3): >>>ansible-tmp-1726882204.0091617-12864-225298688725967=/root/.ansible/tmp/ansible-tmp-1726882204.0091617-12864-225298688725967 <<< 11728 1726882204.03831: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882204.03834: stdout chunk (state=3): >>><<< 11728 1726882204.03837: stderr chunk (state=3): >>><<< 
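
The mkdir one-liner just above is how each task item gets its own remote scratch directory: umask 77 keeps it private, the directory name embeds a timestamp plus two numbers, and the trailing echo reports the resolved name=path pair back to the controller. A rough local equivalent, purely illustrative (the meaning of the middle number in the directory name is not asserted here, a PID is used in its place):

    # Sketch of the per-task tmp-dir step: private base dir, uniquely named subdir,
    # and the name=path line echoed back so the caller learns where to upload the module.
    import os
    import random
    import time

    def make_task_tmpdir(base: str = os.path.expanduser("~/.ansible/tmp")) -> str:
        os.makedirs(base, mode=0o700, exist_ok=True)   # roughly `umask 77 && mkdir -p`
        name = f"ansible-tmp-{time.time()}-{os.getpid()}-{random.randrange(10**15)}"
        path = os.path.join(base, name)
        os.mkdir(path, mode=0o700)                     # the per-task directory itself
        print(f"{name}={path}")                        # mirrors the trailing `echo name=path`
        return path

    if __name__ == "__main__":
        make_task_tmpdir()
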
11728 1726882204.03999: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882204.0091617-12864-225298688725967=/root/.ansible/tmp/ansible-tmp-1726882204.0091617-12864-225298688725967 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882204.04002: variable 'ansible_module_compression' from source: unknown 11728 1726882204.04004: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882204.04006: variable 'ansible_facts' from source: unknown 11728 1726882204.04019: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882204.0091617-12864-225298688725967/AnsiballZ_command.py 11728 1726882204.04145: Sending initial data 11728 1726882204.04155: Sent initial data (156 bytes) 11728 1726882204.04828: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882204.04909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882204.04947: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882204.04962: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882204.04984: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882204.05232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882204.06724: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11728 1726882204.06732: stderr chunk (state=3): >>>debug2: Server supports 
extension "posix-rename@openssh.com" revision 1 <<< 11728 1726882204.06746: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882204.06808: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882204.06880: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpo1gm4xvu /root/.ansible/tmp/ansible-tmp-1726882204.0091617-12864-225298688725967/AnsiballZ_command.py <<< 11728 1726882204.06883: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882204.0091617-12864-225298688725967/AnsiballZ_command.py" <<< 11728 1726882204.06925: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpo1gm4xvu" to remote "/root/.ansible/tmp/ansible-tmp-1726882204.0091617-12864-225298688725967/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882204.0091617-12864-225298688725967/AnsiballZ_command.py" <<< 11728 1726882204.07799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882204.07802: stderr chunk (state=3): >>><<< 11728 1726882204.07804: stdout chunk (state=3): >>><<< 11728 1726882204.07828: done transferring module to remote 11728 1726882204.07836: _low_level_execute_command(): starting 11728 1726882204.07841: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882204.0091617-12864-225298688725967/ /root/.ansible/tmp/ansible-tmp-1726882204.0091617-12864-225298688725967/AnsiballZ_command.py && sleep 0' 11728 1726882204.08444: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882204.08454: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882204.08508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882204.08570: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882204.08730: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882204.08921: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882204.10662: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882204.10665: stdout chunk (state=3): >>><<< 11728 1726882204.10771: stderr chunk (state=3): >>><<< 11728 1726882204.10774: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882204.10777: _low_level_execute_command(): starting 11728 1726882204.10780: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882204.0091617-12864-225298688725967/AnsiballZ_command.py && sleep 0' 11728 1726882204.11333: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882204.11348: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882204.11364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882204.11381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882204.11407: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882204.11421: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882204.11436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882204.11509: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882204.11551: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882204.11592: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882204.11611: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 11728 1726882204.11782: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882204.27113: stdout chunk (state=3): >>> {"changed": true, "stdout": "0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/updelay"], "start": "2024-09-20 21:30:04.266337", "end": "2024-09-20 21:30:04.269463", "delta": "0:00:00.003126", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/updelay", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882204.28662: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882204.28719: stderr chunk (state=3): >>><<< 11728 1726882204.28765: stdout chunk (state=3): >>><<< 11728 1726882204.28787: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/updelay"], "start": "2024-09-20 21:30:04.266337", "end": "2024-09-20 21:30:04.269463", "delta": "0:00:00.003126", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/updelay", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
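
Each run of AnsiballZ_command.py returns a single JSON document on stdout (the blob shown above for updelay); the controller parses it into the result dict that later surfaces in the ok: output. A small sketch of that controller-side step, using the updelay JSON from this run; the variable names are illustrative, not Ansible internals:

    # Parse the module's stdout into a result dict and apply the loop's conditional,
    # which the log records as `Evaluated conditional (bond_opt.value in result.stdout)`.
    import json

    module_stdout = (
        '{"changed": true, "stdout": "0", "stderr": "", "rc": 0, '
        '"cmd": ["cat", "/sys/class/net/nm-bond/bonding/updelay"]}'
    )

    result = json.loads(module_stdout)
    bond_opt = {"key": "updelay", "value": "0"}

    assert result["rc"] == 0                       # the remote command ran cleanly
    assert bond_opt["value"] in result["stdout"]   # the value read from sysfs matches
    print(f"{bond_opt['key']} verified, stdout={result['stdout']!r}")
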
11728 1726882204.28834: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/updelay', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882204.0091617-12864-225298688725967/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882204.28850: _low_level_execute_command(): starting 11728 1726882204.28901: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882204.0091617-12864-225298688725967/ > /dev/null 2>&1 && sleep 0' 11728 1726882204.30332: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882204.30348: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882204.30363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882204.30401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882204.30499: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882204.30547: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882204.30569: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882204.30683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882204.32576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882204.32639: stdout chunk (state=3): >>><<< 11728 1726882204.32643: stderr chunk (state=3): >>><<< 11728 1726882204.32659: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882204.32669: handler run complete 11728 1726882204.33020: Evaluated conditional (False): False 11728 1726882204.33087: variable 'bond_opt' from source: unknown 11728 1726882204.33103: variable 'result' from source: unknown 11728 1726882204.33137: Evaluated conditional (bond_opt.value in result.stdout): True 11728 1726882204.33162: attempt loop complete, returning result 11728 1726882204.33186: variable 'bond_opt' from source: unknown 11728 1726882204.33272: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'updelay', 'value': '0'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "updelay", "value": "0" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/updelay" ], "delta": "0:00:00.003126", "end": "2024-09-20 21:30:04.269463", "rc": 0, "start": "2024-09-20 21:30:04.266337" } STDOUT: 0 11728 1726882204.33578: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882204.33667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882204.33670: variable 'omit' from source: magic vars 11728 1726882204.33796: variable 'ansible_distribution_major_version' from source: facts 11728 1726882204.33811: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882204.33820: variable 'omit' from source: magic vars 11728 1726882204.33838: variable 'omit' from source: magic vars 11728 1726882204.34241: variable 'controller_device' from source: play vars 11728 1726882204.34243: variable 'bond_opt' from source: unknown 11728 1726882204.34245: variable 'omit' from source: magic vars 11728 1726882204.34247: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882204.34249: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882204.34251: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882204.34253: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882204.34254: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882204.34256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882204.34323: Set connection var ansible_connection to ssh 11728 1726882204.34331: Set connection var ansible_shell_executable to /bin/sh 11728 1726882204.34336: Set connection var ansible_timeout to 10 11728 1726882204.34339: Set connection var ansible_shell_type to sh 11728 1726882204.34346: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882204.34351: Set connection var ansible_pipelining to False 11728 1726882204.34371: variable 'ansible_shell_executable' from source: unknown 11728 1726882204.34374: variable 
'ansible_connection' from source: unknown 11728 1726882204.34376: variable 'ansible_module_compression' from source: unknown 11728 1726882204.34378: variable 'ansible_shell_type' from source: unknown 11728 1726882204.34381: variable 'ansible_shell_executable' from source: unknown 11728 1726882204.34383: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882204.34387: variable 'ansible_pipelining' from source: unknown 11728 1726882204.34390: variable 'ansible_timeout' from source: unknown 11728 1726882204.34395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882204.34482: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882204.34489: variable 'omit' from source: magic vars 11728 1726882204.34492: starting attempt loop 11728 1726882204.34499: running the handler 11728 1726882204.34507: _low_level_execute_command(): starting 11728 1726882204.34510: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882204.35058: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882204.35067: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882204.35077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882204.35089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882204.35110: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882204.35114: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882204.35121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882204.35134: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882204.35141: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882204.35147: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11728 1726882204.35202: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882204.35205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882204.35207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882204.35209: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882204.35218: stderr chunk (state=3): >>>debug2: match found <<< 11728 1726882204.35222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882204.35258: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882204.35300: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882204.35344: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882204.35424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882204.36980: stdout chunk (state=3): >>>/root <<< 
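
Every one of these short steps (echo ~, the mkdir, chmod, the module run, the cleanup) is a separate ssh invocation, yet the stderr above shows each one attaching to an existing master at /root/.ansible/cp/537759ca41 instead of negotiating a new connection. That is standard OpenSSH connection multiplexing; a hedged sketch of the equivalent client options follows, with the host and control path as placeholders rather than Ansible's exact defaults:

    # Reuse one SSH connection for many short commands via ControlMaster/ControlPath,
    # the mechanism behind the "auto-mux: Trying existing master at ..." lines above.
    import subprocess

    SSH_OPTS = [
        "-o", "ControlMaster=auto",            # start a master only if none is running
        "-o", "ControlPersist=60s",            # keep it alive between commands
        "-o", "ControlPath=~/.ssh/cm-%C",      # %C = hash of host/port/user (placeholder path)
    ]

    def run_remote(host: str, command: str) -> subprocess.CompletedProcess:
        # One low-level execute, shaped like the log's: /bin/sh -c '<command> && sleep 0'
        return subprocess.run(
            ["ssh", *SSH_OPTS, host, f"/bin/sh -c '{command} && sleep 0'"],
            capture_output=True, text=True,
        )

    if __name__ == "__main__":
        print(run_remote("root@10.31.10.229", "echo ~").stdout.strip())
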
11728 1726882204.37076: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882204.37123: stderr chunk (state=3): >>><<< 11728 1726882204.37138: stdout chunk (state=3): >>><<< 11728 1726882204.37159: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882204.37170: _low_level_execute_command(): starting 11728 1726882204.37177: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882204.3716319-12864-116885281060937 `" && echo ansible-tmp-1726882204.3716319-12864-116885281060937="` echo /root/.ansible/tmp/ansible-tmp-1726882204.3716319-12864-116885281060937 `" ) && sleep 0' 11728 1726882204.37798: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882204.37876: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882204.37931: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882204.37949: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882204.37971: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882204.38070: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882204.39916: stdout chunk (state=3): >>>ansible-tmp-1726882204.3716319-12864-116885281060937=/root/.ansible/tmp/ansible-tmp-1726882204.3716319-12864-116885281060937 <<< 11728 
1726882204.40075: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882204.40078: stdout chunk (state=3): >>><<< 11728 1726882204.40081: stderr chunk (state=3): >>><<< 11728 1726882204.40102: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882204.3716319-12864-116885281060937=/root/.ansible/tmp/ansible-tmp-1726882204.3716319-12864-116885281060937 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882204.40133: variable 'ansible_module_compression' from source: unknown 11728 1726882204.40185: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882204.40291: variable 'ansible_facts' from source: unknown 11728 1726882204.40298: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882204.3716319-12864-116885281060937/AnsiballZ_command.py 11728 1726882204.40440: Sending initial data 11728 1726882204.40443: Sent initial data (156 bytes) 11728 1726882204.41082: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882204.41184: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882204.41227: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882204.41258: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882204.42779: stderr chunk (state=3): >>>debug2: Remote 
version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882204.42842: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882204.42916: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmp57z7wp0m /root/.ansible/tmp/ansible-tmp-1726882204.3716319-12864-116885281060937/AnsiballZ_command.py <<< 11728 1726882204.42927: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882204.3716319-12864-116885281060937/AnsiballZ_command.py" <<< 11728 1726882204.42962: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmp57z7wp0m" to remote "/root/.ansible/tmp/ansible-tmp-1726882204.3716319-12864-116885281060937/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882204.3716319-12864-116885281060937/AnsiballZ_command.py" <<< 11728 1726882204.43800: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882204.43804: stderr chunk (state=3): >>><<< 11728 1726882204.43806: stdout chunk (state=3): >>><<< 11728 1726882204.43839: done transferring module to remote 11728 1726882204.43847: _low_level_execute_command(): starting 11728 1726882204.43856: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882204.3716319-12864-116885281060937/ /root/.ansible/tmp/ansible-tmp-1726882204.3716319-12864-116885281060937/AnsiballZ_command.py && sleep 0' 11728 1726882204.44479: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882204.44510: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882204.44597: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882204.44622: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882204.44633: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882204.44704: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882204.46575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882204.46578: stdout chunk (state=3): >>><<< 11728 1726882204.46581: stderr chunk (state=3): >>><<< 11728 1726882204.46587: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882204.46589: _low_level_execute_command(): starting 11728 1726882204.46591: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882204.3716319-12864-116885281060937/AnsiballZ_command.py && sleep 0' 11728 1726882204.47155: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882204.47300: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882204.47304: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882204.47310: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882204.47312: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882204.47365: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882204.62577: stdout chunk (state=3): >>> {"changed": true, "stdout": "1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/use_carrier"], "start": "2024-09-20 
21:30:04.620912", "end": "2024-09-20 21:30:04.624033", "delta": "0:00:00.003121", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/use_carrier", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882204.64203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882204.64207: stdout chunk (state=3): >>><<< 11728 1726882204.64209: stderr chunk (state=3): >>><<< 11728 1726882204.64211: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/use_carrier"], "start": "2024-09-20 21:30:04.620912", "end": "2024-09-20 21:30:04.624033", "delta": "0:00:00.003121", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/use_carrier", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
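The exchange above is the connection plugin's standard per-command cycle over the multiplexed SSH session: discover the remote home directory, create a per-invocation temp directory, upload the packaged AnsiballZ_command.py over SFTP, mark it executable, run it with the remote Python, and (as the next entries show) remove the temp directory again. Condensed from the commands visible in the log, with the timestamped directory name shortened to a placeholder, the remote side of one invocation is roughly:

    # condensed reproduction of the remote command sequence seen above;
    # the real directory name embeds a timestamp and a per-run counter
    tmp=/root/.ansible/tmp/ansible-tmp-example
    sh -c 'echo ~ && sleep 0'                                                    # discover remote home
    sh -c "( umask 77 && mkdir -p /root/.ansible/tmp && mkdir $tmp ) && sleep 0" # private temp dir
    # AnsiballZ_command.py is then uploaded into $tmp via sftp over the existing SSH master
    sh -c "chmod u+x $tmp/ $tmp/AnsiballZ_command.py && sleep 0"                 # make module executable
    sh -c "/usr/bin/python3.12 $tmp/AnsiballZ_command.py && sleep 0"             # run the module, JSON result on stdout
    sh -c "rm -f -r $tmp/ > /dev/null 2>&1 && sleep 0"                           # clean up the temp dir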
11728 1726882204.64213: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/use_carrier', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882204.3716319-12864-116885281060937/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882204.64215: _low_level_execute_command(): starting 11728 1726882204.64217: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882204.3716319-12864-116885281060937/ > /dev/null 2>&1 && sleep 0' 11728 1726882204.64802: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882204.64819: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882204.64837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882204.64856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882204.64877: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882204.64910: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882204.64992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882204.65026: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882204.65109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882204.66962: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882204.67026: stdout chunk (state=3): >>><<< 11728 1726882204.67059: stderr chunk (state=3): >>><<< 11728 1726882204.67230: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882204.67233: handler run complete 11728 1726882204.67235: Evaluated conditional (False): False 11728 1726882204.67321: variable 'bond_opt' from source: unknown 11728 1726882204.67339: variable 'result' from source: unknown 11728 1726882204.67367: Evaluated conditional (bond_opt.value in result.stdout): True 11728 1726882204.67383: attempt loop complete, returning result 11728 1726882204.67412: variable 'bond_opt' from source: unknown 11728 1726882204.67491: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'use_carrier', 'value': '1'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "use_carrier", "value": "1" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/use_carrier" ], "delta": "0:00:00.003121", "end": "2024-09-20 21:30:04.624033", "rc": 0, "start": "2024-09-20 21:30:04.620912" } STDOUT: 1 11728 1726882204.67760: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882204.67763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882204.67765: variable 'omit' from source: magic vars 11728 1726882204.67978: variable 'ansible_distribution_major_version' from source: facts 11728 1726882204.67981: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882204.67983: variable 'omit' from source: magic vars 11728 1726882204.67985: variable 'omit' from source: magic vars 11728 1726882204.68142: variable 'controller_device' from source: play vars 11728 1726882204.68152: variable 'bond_opt' from source: unknown 11728 1726882204.68174: variable 'omit' from source: magic vars 11728 1726882204.68208: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882204.68229: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882204.68241: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882204.68258: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882204.68301: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882204.68304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882204.68361: Set connection var ansible_connection to ssh 11728 1726882204.68374: Set connection var ansible_shell_executable to /bin/sh 11728 1726882204.68383: Set connection var ansible_timeout to 10 11728 1726882204.68665: Set connection var ansible_shell_type to sh 11728 1726882204.68668: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882204.68670: Set connection var ansible_pipelining to False 11728 1726882204.68671: variable 
'ansible_shell_executable' from source: unknown 11728 1726882204.68673: variable 'ansible_connection' from source: unknown 11728 1726882204.68675: variable 'ansible_module_compression' from source: unknown 11728 1726882204.68676: variable 'ansible_shell_type' from source: unknown 11728 1726882204.68678: variable 'ansible_shell_executable' from source: unknown 11728 1726882204.68679: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882204.68681: variable 'ansible_pipelining' from source: unknown 11728 1726882204.68683: variable 'ansible_timeout' from source: unknown 11728 1726882204.68684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882204.68744: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882204.68756: variable 'omit' from source: magic vars 11728 1726882204.68764: starting attempt loop 11728 1726882204.68778: running the handler 11728 1726882204.68799: _low_level_execute_command(): starting 11728 1726882204.68802: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882204.69846: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882204.70215: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882204.70288: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882204.71838: stdout chunk (state=3): >>>/root <<< 11728 1726882204.71935: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882204.71987: stderr chunk (state=3): >>><<< 11728 1726882204.72003: stdout chunk (state=3): >>><<< 11728 1726882204.72184: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882204.72192: _low_level_execute_command(): starting 11728 1726882204.72198: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882204.721003-12864-122207557582793 `" && echo ansible-tmp-1726882204.721003-12864-122207557582793="` echo /root/.ansible/tmp/ansible-tmp-1726882204.721003-12864-122207557582793 `" ) && sleep 0' 11728 1726882204.73309: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882204.73323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11728 1726882204.73334: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882204.73401: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882204.73511: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882204.73611: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882204.73725: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882204.75555: stdout chunk (state=3): >>>ansible-tmp-1726882204.721003-12864-122207557582793=/root/.ansible/tmp/ansible-tmp-1726882204.721003-12864-122207557582793 <<< 11728 1726882204.75738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882204.75780: stderr chunk (state=3): >>><<< 11728 1726882204.75783: stdout chunk (state=3): >>><<< 11728 1726882204.76025: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882204.721003-12864-122207557582793=/root/.ansible/tmp/ansible-tmp-1726882204.721003-12864-122207557582793 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882204.76029: variable 'ansible_module_compression' from source: unknown 11728 1726882204.76031: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882204.76034: variable 'ansible_facts' from source: unknown 11728 1726882204.76163: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882204.721003-12864-122207557582793/AnsiballZ_command.py 11728 1726882204.76608: Sending initial data 11728 1726882204.76659: Sent initial data (155 bytes) 11728 1726882204.78012: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882204.78152: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882204.78198: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882204.79718: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 11728 1726882204.79960: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882204.79984: 
stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882204.80038: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpvi1th9nf /root/.ansible/tmp/ansible-tmp-1726882204.721003-12864-122207557582793/AnsiballZ_command.py <<< 11728 1726882204.80041: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882204.721003-12864-122207557582793/AnsiballZ_command.py" <<< 11728 1726882204.80079: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpvi1th9nf" to remote "/root/.ansible/tmp/ansible-tmp-1726882204.721003-12864-122207557582793/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882204.721003-12864-122207557582793/AnsiballZ_command.py" <<< 11728 1726882204.82160: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882204.82183: stderr chunk (state=3): >>><<< 11728 1726882204.82187: stdout chunk (state=3): >>><<< 11728 1726882204.82248: done transferring module to remote 11728 1726882204.82257: _low_level_execute_command(): starting 11728 1726882204.82262: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882204.721003-12864-122207557582793/ /root/.ansible/tmp/ansible-tmp-1726882204.721003-12864-122207557582793/AnsiballZ_command.py && sleep 0' 11728 1726882204.83878: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882204.83888: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882204.83891: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882204.84010: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882204.84119: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882204.85910: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882204.85931: stderr chunk (state=3): >>><<< 11728 1726882204.86233: stdout chunk (state=3): >>><<< 11728 1726882204.86251: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882204.86254: _low_level_execute_command(): starting 11728 1726882204.86258: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882204.721003-12864-122207557582793/AnsiballZ_command.py && sleep 0' 11728 1726882204.87479: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882204.87701: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882204.87705: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882204.87707: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882204.87731: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882204.87924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882205.03233: stdout chunk (state=3): >>> {"changed": true, "stdout": "encap2+3 3", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/xmit_hash_policy"], "start": "2024-09-20 21:30:05.027682", "end": "2024-09-20 21:30:05.030743", "delta": "0:00:00.003061", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/xmit_hash_policy", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882205.04732: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 11728 1726882205.04736: stdout chunk (state=3): >>><<< 11728 1726882205.04744: stderr chunk (state=3): >>><<< 11728 1726882205.04762: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "encap2+3 3", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/xmit_hash_policy"], "start": "2024-09-20 21:30:05.027682", "end": "2024-09-20 21:30:05.030743", "delta": "0:00:00.003061", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/xmit_hash_policy", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
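As with use_carrier earlier, each item of the '** TEST check bond settings' loop reads one bonding option of the nm-bond device from sysfs through ansible.legacy.command and then evaluates the conditional 'bond_opt.value in result.stdout' against the output (here "encap2+3 3", matched by the expected value "encap2+3"). A rough stand-alone equivalent of the two checks seen in this run, assuming the same device and option values; the playbook drives this through a command loop, not this script:

    #!/bin/sh
    # hypothetical manual re-check of the values verified in the log above
    dev=nm-bond
    for pair in use_carrier=1 xmit_hash_policy=encap2+3; do
        opt=${pair%%=*}; want=${pair#*=}
        got=$(cat "/sys/class/net/$dev/bonding/$opt")
        # sysfs may print extra tokens (e.g. "encap2+3 3"), so a substring match
        # mirrors the task's "bond_opt.value in result.stdout" conditional
        case "$got" in
            *"$want"*) echo "ok: $opt = $got" ;;
            *)         echo "FAIL: $opt expected '$want', got '$got'" >&2; exit 1 ;;
        esac
    done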
11728 1726882205.04790: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/xmit_hash_policy', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882204.721003-12864-122207557582793/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882205.04795: _low_level_execute_command(): starting 11728 1726882205.04917: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882204.721003-12864-122207557582793/ > /dev/null 2>&1 && sleep 0' 11728 1726882205.05701: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882205.05704: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882205.05706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882205.05708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882205.05710: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882205.05712: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882205.05714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882205.05720: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882205.05722: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882205.05725: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11728 1726882205.05726: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882205.05728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882205.05730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882205.05732: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882205.05734: stderr chunk (state=3): >>>debug2: match found <<< 11728 1726882205.05736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882205.05738: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882205.05800: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882205.05808: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882205.05865: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882205.07674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882205.07683: stdout chunk (state=3): >>><<< 11728 1726882205.07699: stderr chunk (state=3): >>><<< 11728 1726882205.07718: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882205.07727: handler run complete 11728 1726882205.07751: Evaluated conditional (False): False 11728 1726882205.07895: variable 'bond_opt' from source: unknown 11728 1726882205.07912: variable 'result' from source: unknown 11728 1726882205.08099: Evaluated conditional (bond_opt.value in result.stdout): True 11728 1726882205.08102: attempt loop complete, returning result 11728 1726882205.08104: variable 'bond_opt' from source: unknown 11728 1726882205.08106: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'xmit_hash_policy', 'value': 'encap2+3'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "xmit_hash_policy", "value": "encap2+3" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/xmit_hash_policy" ], "delta": "0:00:00.003061", "end": "2024-09-20 21:30:05.030743", "rc": 0, "start": "2024-09-20 21:30:05.027682" } STDOUT: encap2+3 3 11728 1726882205.08204: dumping result to json 11728 1726882205.08319: done dumping result, returning 11728 1726882205.08322: done running TaskExecutor() for managed_node3/TASK: ** TEST check bond settings [12673a56-9f93-5c28-a762-000000000400] 11728 1726882205.08324: sending task result for task 12673a56-9f93-5c28-a762-000000000400 11728 1726882205.08699: done sending task result for task 12673a56-9f93-5c28-a762-000000000400 11728 1726882205.08703: WORKER PROCESS EXITING 11728 1726882205.09427: no more pending results, returning what we have 11728 1726882205.09431: results queue empty 11728 1726882205.09431: checking for any_errors_fatal 11728 1726882205.09437: done checking for any_errors_fatal 11728 1726882205.09437: checking for max_fail_percentage 11728 1726882205.09439: done checking for max_fail_percentage 11728 1726882205.09440: checking to see if all hosts have failed and the running result is not ok 11728 1726882205.09441: done checking to see if all hosts have failed 11728 1726882205.09441: getting the remaining hosts for this loop 11728 1726882205.09443: done getting the remaining hosts for this loop 11728 1726882205.09446: getting the next task for host managed_node3 11728 1726882205.09452: done getting next task for host managed_node3 11728 1726882205.09454: ^ task is: TASK: Include the task 'assert_IPv4_present.yml' 11728 1726882205.09457: ^ state is: HOST 
STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882205.09461: getting variables 11728 1726882205.09462: in VariableManager get_vars() 11728 1726882205.09487: Calling all_inventory to load vars for managed_node3 11728 1726882205.09490: Calling groups_inventory to load vars for managed_node3 11728 1726882205.09495: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882205.09505: Calling all_plugins_play to load vars for managed_node3 11728 1726882205.09508: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882205.09511: Calling groups_plugins_play to load vars for managed_node3 11728 1726882205.11028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882205.12528: done with get_vars() 11728 1726882205.12553: done getting variables TASK [Include the task 'assert_IPv4_present.yml'] ****************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:11 Friday 20 September 2024 21:30:05 -0400 (0:00:06.900) 0:00:29.978 ****** 11728 1726882205.12651: entering _queue_task() for managed_node3/include_tasks 11728 1726882205.12989: worker is 1 (out of 1 available) 11728 1726882205.13004: exiting _queue_task() for managed_node3/include_tasks 11728 1726882205.13016: done queuing things up, now waiting for results queue to drain 11728 1726882205.13017: waiting for pending results... 
11728 1726882205.13415: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_IPv4_present.yml' 11728 1726882205.13421: in run() - task 12673a56-9f93-5c28-a762-000000000402 11728 1726882205.13442: variable 'ansible_search_path' from source: unknown 11728 1726882205.13451: variable 'ansible_search_path' from source: unknown 11728 1726882205.13495: calling self._execute() 11728 1726882205.13595: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882205.13610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882205.13630: variable 'omit' from source: magic vars 11728 1726882205.13990: variable 'ansible_distribution_major_version' from source: facts 11728 1726882205.14011: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882205.14024: _execute() done 11728 1726882205.14032: dumping result to json 11728 1726882205.14040: done dumping result, returning 11728 1726882205.14051: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_IPv4_present.yml' [12673a56-9f93-5c28-a762-000000000402] 11728 1726882205.14063: sending task result for task 12673a56-9f93-5c28-a762-000000000402 11728 1726882205.14330: no more pending results, returning what we have 11728 1726882205.14336: in VariableManager get_vars() 11728 1726882205.14375: Calling all_inventory to load vars for managed_node3 11728 1726882205.14378: Calling groups_inventory to load vars for managed_node3 11728 1726882205.14382: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882205.14398: Calling all_plugins_play to load vars for managed_node3 11728 1726882205.14402: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882205.14406: Calling groups_plugins_play to load vars for managed_node3 11728 1726882205.15007: done sending task result for task 12673a56-9f93-5c28-a762-000000000402 11728 1726882205.15010: WORKER PROCESS EXITING 11728 1726882205.15932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882205.17453: done with get_vars() 11728 1726882205.17472: variable 'ansible_search_path' from source: unknown 11728 1726882205.17474: variable 'ansible_search_path' from source: unknown 11728 1726882205.17483: variable 'item' from source: include params 11728 1726882205.17591: variable 'item' from source: include params 11728 1726882205.17629: we have included files to process 11728 1726882205.17631: generating all_blocks data 11728 1726882205.17633: done generating all_blocks data 11728 1726882205.17638: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 11728 1726882205.17639: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 11728 1726882205.17642: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 11728 1726882205.17878: done processing included file 11728 1726882205.17881: iterating over new_blocks loaded from include file 11728 1726882205.17882: in VariableManager get_vars() 11728 1726882205.17899: done with get_vars() 11728 1726882205.17901: filtering new block on tags 11728 1726882205.17928: done filtering new block on tags 11728 1726882205.17930: done iterating over new_blocks loaded from include file included: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml for managed_node3 11728 1726882205.17936: extending task lists for all hosts with included blocks 11728 1726882205.18149: done extending task lists 11728 1726882205.18150: done processing included files 11728 1726882205.18151: results queue empty 11728 1726882205.18152: checking for any_errors_fatal 11728 1726882205.18164: done checking for any_errors_fatal 11728 1726882205.18165: checking for max_fail_percentage 11728 1726882205.18166: done checking for max_fail_percentage 11728 1726882205.18167: checking to see if all hosts have failed and the running result is not ok 11728 1726882205.18167: done checking to see if all hosts have failed 11728 1726882205.18168: getting the remaining hosts for this loop 11728 1726882205.18169: done getting the remaining hosts for this loop 11728 1726882205.18172: getting the next task for host managed_node3 11728 1726882205.18176: done getting next task for host managed_node3 11728 1726882205.18177: ^ task is: TASK: ** TEST check IPv4 11728 1726882205.18180: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882205.18182: getting variables 11728 1726882205.18183: in VariableManager get_vars() 11728 1726882205.18192: Calling all_inventory to load vars for managed_node3 11728 1726882205.18196: Calling groups_inventory to load vars for managed_node3 11728 1726882205.18198: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882205.18204: Calling all_plugins_play to load vars for managed_node3 11728 1726882205.18206: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882205.18209: Calling groups_plugins_play to load vars for managed_node3 11728 1726882205.18966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882205.19806: done with get_vars() 11728 1726882205.19820: done getting variables 11728 1726882205.19848: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml:3 Friday 20 September 2024 21:30:05 -0400 (0:00:00.072) 0:00:30.051 ****** 11728 1726882205.19869: entering _queue_task() for managed_node3/command 11728 1726882205.20125: worker is 1 (out of 1 available) 11728 1726882205.20138: exiting _queue_task() for managed_node3/command 11728 1726882205.20159: done queuing things up, now waiting for results queue to drain 11728 1726882205.20160: waiting for pending results... 
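The '** TEST check IPv4' task being queued here takes the interface name from controller_device (presumably nm-bond in this run, given the bonding paths above); its actual command is not visible in this portion of the log. As a purely hypothetical manual check of the same property, one could confirm that the interface carries at least one IPv4 address:

    # hypothetical manual check, not the task's literal command
    dev=nm-bond
    if ip -4 addr show dev "$dev" | grep -q 'inet '; then
        echo "IPv4 address present on $dev"
    else
        echo "no IPv4 address on $dev" >&2
        exit 1
    fi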
11728 1726882205.20342: running TaskExecutor() for managed_node3/TASK: ** TEST check IPv4 11728 1726882205.20416: in run() - task 12673a56-9f93-5c28-a762-000000000631 11728 1726882205.20429: variable 'ansible_search_path' from source: unknown 11728 1726882205.20438: variable 'ansible_search_path' from source: unknown 11728 1726882205.20703: calling self._execute() 11728 1726882205.20711: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882205.20715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882205.20718: variable 'omit' from source: magic vars 11728 1726882205.20875: variable 'ansible_distribution_major_version' from source: facts 11728 1726882205.20884: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882205.20890: variable 'omit' from source: magic vars 11728 1726882205.20929: variable 'omit' from source: magic vars 11728 1726882205.21054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882205.23080: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882205.23128: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882205.23155: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882205.23179: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882205.23203: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882205.23262: variable 'interface' from source: include params 11728 1726882205.23266: variable 'controller_device' from source: play vars 11728 1726882205.23317: variable 'controller_device' from source: play vars 11728 1726882205.23336: variable 'omit' from source: magic vars 11728 1726882205.23359: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882205.23381: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882205.23399: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882205.23411: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882205.23420: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882205.23445: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882205.23448: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882205.23451: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882205.23514: Set connection var ansible_connection to ssh 11728 1726882205.23523: Set connection var ansible_shell_executable to /bin/sh 11728 1726882205.23528: Set connection var ansible_timeout to 10 11728 1726882205.23531: Set connection var ansible_shell_type to sh 11728 1726882205.23538: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882205.23540: Set connection var ansible_pipelining to False 11728 1726882205.23560: variable 'ansible_shell_executable' from source: unknown 11728 1726882205.23564: variable 
'ansible_connection' from source: unknown 11728 1726882205.23566: variable 'ansible_module_compression' from source: unknown 11728 1726882205.23568: variable 'ansible_shell_type' from source: unknown 11728 1726882205.23571: variable 'ansible_shell_executable' from source: unknown 11728 1726882205.23573: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882205.23575: variable 'ansible_pipelining' from source: unknown 11728 1726882205.23577: variable 'ansible_timeout' from source: unknown 11728 1726882205.23582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882205.23656: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882205.23667: variable 'omit' from source: magic vars 11728 1726882205.23672: starting attempt loop 11728 1726882205.23674: running the handler 11728 1726882205.23687: _low_level_execute_command(): starting 11728 1726882205.23694: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882205.24184: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882205.24188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882205.24191: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882205.24197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882205.24246: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882205.24250: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882205.24319: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882205.25985: stdout chunk (state=3): >>>/root <<< 11728 1726882205.26087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882205.26119: stderr chunk (state=3): >>><<< 11728 1726882205.26123: stdout chunk (state=3): >>><<< 11728 1726882205.26144: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882205.26155: _low_level_execute_command(): starting 11728 1726882205.26159: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882205.2614288-13306-133124729458407 `" && echo ansible-tmp-1726882205.2614288-13306-133124729458407="` echo /root/.ansible/tmp/ansible-tmp-1726882205.2614288-13306-133124729458407 `" ) && sleep 0' 11728 1726882205.26606: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882205.26609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882205.26612: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882205.26614: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882205.26616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 11728 1726882205.26618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882205.26669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882205.26674: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882205.26677: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882205.26721: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882205.29000: stdout chunk (state=3): >>>ansible-tmp-1726882205.2614288-13306-133124729458407=/root/.ansible/tmp/ansible-tmp-1726882205.2614288-13306-133124729458407 <<< 11728 1726882205.29003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882205.29005: stdout chunk (state=3): >>><<< 11728 1726882205.29007: stderr chunk (state=3): >>><<< 11728 1726882205.29009: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882205.2614288-13306-133124729458407=/root/.ansible/tmp/ansible-tmp-1726882205.2614288-13306-133124729458407 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882205.29011: variable 'ansible_module_compression' from source: unknown 11728 1726882205.29013: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882205.29014: variable 'ansible_facts' from source: unknown 11728 1726882205.29032: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882205.2614288-13306-133124729458407/AnsiballZ_command.py 11728 1726882205.29421: Sending initial data 11728 1726882205.29607: Sent initial data (156 bytes) 11728 1726882205.30303: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882205.30592: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882205.30610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882205.30680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882205.32197: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 
debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 11728 1726882205.32206: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882205.32238: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882205.32283: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmp228pzr_c /root/.ansible/tmp/ansible-tmp-1726882205.2614288-13306-133124729458407/AnsiballZ_command.py <<< 11728 1726882205.32289: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882205.2614288-13306-133124729458407/AnsiballZ_command.py" <<< 11728 1726882205.32327: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 11728 1726882205.32331: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmp228pzr_c" to remote "/root/.ansible/tmp/ansible-tmp-1726882205.2614288-13306-133124729458407/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882205.2614288-13306-133124729458407/AnsiballZ_command.py" <<< 11728 1726882205.32865: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882205.32897: stderr chunk (state=3): >>><<< 11728 1726882205.32901: stdout chunk (state=3): >>><<< 11728 1726882205.32945: done transferring module to remote 11728 1726882205.32954: _low_level_execute_command(): starting 11728 1726882205.32960: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882205.2614288-13306-133124729458407/ /root/.ansible/tmp/ansible-tmp-1726882205.2614288-13306-133124729458407/AnsiballZ_command.py && sleep 0' 11728 1726882205.33390: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882205.33395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882205.33398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882205.33402: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882205.33404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882205.33450: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882205.33454: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882205.33506: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882205.35233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
11728 1726882205.35256: stderr chunk (state=3): >>><<< 11728 1726882205.35259: stdout chunk (state=3): >>><<< 11728 1726882205.35274: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882205.35279: _low_level_execute_command(): starting 11728 1726882205.35284: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882205.2614288-13306-133124729458407/AnsiballZ_command.py && sleep 0' 11728 1726882205.35711: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882205.35714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882205.35716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882205.35719: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882205.35721: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882205.35767: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882205.35770: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882205.35823: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882205.51236: stdout chunk (state=3): >>> {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.51/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 229sec preferred_lft 229sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 
21:30:05.506953", "end": "2024-09-20 21:30:05.510608", "delta": "0:00:00.003655", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882205.52722: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882205.52748: stderr chunk (state=3): >>><<< 11728 1726882205.52751: stdout chunk (state=3): >>><<< 11728 1726882205.52767: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.51/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 229sec preferred_lft 229sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 21:30:05.506953", "end": "2024-09-20 21:30:05.510608", "delta": "0:00:00.003655", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
11728 1726882205.52797: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882205.2614288-13306-133124729458407/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882205.52806: _low_level_execute_command(): starting 11728 1726882205.52812: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882205.2614288-13306-133124729458407/ > /dev/null 2>&1 && sleep 0' 11728 1726882205.53254: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882205.53258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882205.53260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882205.53262: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882205.53264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882205.53309: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882205.53321: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882205.53373: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882205.55159: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882205.55185: stderr chunk (state=3): >>><<< 11728 1726882205.55188: stdout chunk (state=3): >>><<< 11728 1726882205.55208: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882205.55214: handler run complete 11728 1726882205.55233: Evaluated conditional (False): False 11728 1726882205.55356: variable 'address' from source: include params 11728 1726882205.55360: variable 'result' from source: set_fact 11728 1726882205.55373: Evaluated conditional (address in result.stdout): True 11728 1726882205.55382: attempt loop complete, returning result 11728 1726882205.55385: _execute() done 11728 1726882205.55387: dumping result to json 11728 1726882205.55392: done dumping result, returning 11728 1726882205.55404: done running TaskExecutor() for managed_node3/TASK: ** TEST check IPv4 [12673a56-9f93-5c28-a762-000000000631] 11728 1726882205.55407: sending task result for task 12673a56-9f93-5c28-a762-000000000631 11728 1726882205.55499: done sending task result for task 12673a56-9f93-5c28-a762-000000000631 11728 1726882205.55502: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "nm-bond" ], "delta": "0:00:00.003655", "end": "2024-09-20 21:30:05.510608", "rc": 0, "start": "2024-09-20 21:30:05.506953" } STDOUT: 18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.51/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond valid_lft 229sec preferred_lft 229sec 11728 1726882205.55583: no more pending results, returning what we have 11728 1726882205.55586: results queue empty 11728 1726882205.55587: checking for any_errors_fatal 11728 1726882205.55589: done checking for any_errors_fatal 11728 1726882205.55589: checking for max_fail_percentage 11728 1726882205.55591: done checking for max_fail_percentage 11728 1726882205.55592: checking to see if all hosts have failed and the running result is not ok 11728 1726882205.55592: done checking to see if all hosts have failed 11728 1726882205.55595: getting the remaining hosts for this loop 11728 1726882205.55597: done getting the remaining hosts for this loop 11728 1726882205.55600: getting the next task for host managed_node3 11728 1726882205.55608: done getting next task for host managed_node3 11728 1726882205.55611: ^ task is: TASK: Include the task 'assert_IPv6_present.yml' 11728 1726882205.55614: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 11728 1726882205.55619: getting variables 11728 1726882205.55621: in VariableManager get_vars() 11728 1726882205.55651: Calling all_inventory to load vars for managed_node3 11728 1726882205.55654: Calling groups_inventory to load vars for managed_node3 11728 1726882205.55657: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882205.55667: Calling all_plugins_play to load vars for managed_node3 11728 1726882205.55670: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882205.55673: Calling groups_plugins_play to load vars for managed_node3 11728 1726882205.56547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882205.57379: done with get_vars() 11728 1726882205.57398: done getting variables TASK [Include the task 'assert_IPv6_present.yml'] ****************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:16 Friday 20 September 2024 21:30:05 -0400 (0:00:00.375) 0:00:30.427 ****** 11728 1726882205.57463: entering _queue_task() for managed_node3/include_tasks 11728 1726882205.57703: worker is 1 (out of 1 available) 11728 1726882205.57719: exiting _queue_task() for managed_node3/include_tasks 11728 1726882205.57731: done queuing things up, now waiting for results queue to drain 11728 1726882205.57732: waiting for pending results... 11728 1726882205.57906: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_IPv6_present.yml' 11728 1726882205.57972: in run() - task 12673a56-9f93-5c28-a762-000000000403 11728 1726882205.57983: variable 'ansible_search_path' from source: unknown 11728 1726882205.57987: variable 'ansible_search_path' from source: unknown 11728 1726882205.58020: calling self._execute() 11728 1726882205.58090: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882205.58095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882205.58105: variable 'omit' from source: magic vars 11728 1726882205.58368: variable 'ansible_distribution_major_version' from source: facts 11728 1726882205.58378: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882205.58384: _execute() done 11728 1726882205.58386: dumping result to json 11728 1726882205.58390: done dumping result, returning 11728 1726882205.58400: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_IPv6_present.yml' [12673a56-9f93-5c28-a762-000000000403] 11728 1726882205.58405: sending task result for task 12673a56-9f93-5c28-a762-000000000403 11728 1726882205.58490: done sending task result for task 12673a56-9f93-5c28-a762-000000000403 11728 1726882205.58496: WORKER PROCESS EXITING 11728 1726882205.58525: no more pending results, returning what we have 11728 1726882205.58530: in VariableManager get_vars() 11728 1726882205.58564: Calling all_inventory to load vars for managed_node3 11728 1726882205.58567: Calling groups_inventory to load vars for managed_node3 11728 1726882205.58570: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882205.58582: Calling all_plugins_play to load vars for managed_node3 11728 1726882205.58585: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882205.58587: Calling groups_plugins_play to load vars for managed_node3 11728 1726882205.59328: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882205.60141: done with get_vars() 11728 1726882205.60155: variable 'ansible_search_path' from source: unknown 11728 1726882205.60156: variable 'ansible_search_path' from source: unknown 11728 1726882205.60162: variable 'item' from source: include params 11728 1726882205.60236: variable 'item' from source: include params 11728 1726882205.60259: we have included files to process 11728 1726882205.60259: generating all_blocks data 11728 1726882205.60261: done generating all_blocks data 11728 1726882205.60264: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 11728 1726882205.60265: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 11728 1726882205.60266: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 11728 1726882205.60420: done processing included file 11728 1726882205.60422: iterating over new_blocks loaded from include file 11728 1726882205.60423: in VariableManager get_vars() 11728 1726882205.60433: done with get_vars() 11728 1726882205.60434: filtering new block on tags 11728 1726882205.60450: done filtering new block on tags 11728 1726882205.60451: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml for managed_node3 11728 1726882205.60455: extending task lists for all hosts with included blocks 11728 1726882205.60646: done extending task lists 11728 1726882205.60647: done processing included files 11728 1726882205.60648: results queue empty 11728 1726882205.60648: checking for any_errors_fatal 11728 1726882205.60652: done checking for any_errors_fatal 11728 1726882205.60652: checking for max_fail_percentage 11728 1726882205.60653: done checking for max_fail_percentage 11728 1726882205.60653: checking to see if all hosts have failed and the running result is not ok 11728 1726882205.60654: done checking to see if all hosts have failed 11728 1726882205.60654: getting the remaining hosts for this loop 11728 1726882205.60655: done getting the remaining hosts for this loop 11728 1726882205.60656: getting the next task for host managed_node3 11728 1726882205.60659: done getting next task for host managed_node3 11728 1726882205.60660: ^ task is: TASK: ** TEST check IPv6 11728 1726882205.60662: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882205.60664: getting variables 11728 1726882205.60664: in VariableManager get_vars() 11728 1726882205.60670: Calling all_inventory to load vars for managed_node3 11728 1726882205.60671: Calling groups_inventory to load vars for managed_node3 11728 1726882205.60673: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882205.60676: Calling all_plugins_play to load vars for managed_node3 11728 1726882205.60678: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882205.60679: Calling groups_plugins_play to load vars for managed_node3 11728 1726882205.61356: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882205.62161: done with get_vars() 11728 1726882205.62175: done getting variables 11728 1726882205.62208: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml:3 Friday 20 September 2024 21:30:05 -0400 (0:00:00.047) 0:00:30.474 ****** 11728 1726882205.62230: entering _queue_task() for managed_node3/command 11728 1726882205.62468: worker is 1 (out of 1 available) 11728 1726882205.62482: exiting _queue_task() for managed_node3/command 11728 1726882205.62496: done queuing things up, now waiting for results queue to drain 11728 1726882205.62497: waiting for pending results... 
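
At this point the play has reached assert_bond_options.yml:16, which includes assert_IPv6_present.yml for managed_node3; the "variable 'item' from source: include params" entries suggest the include is driven by a loop whose current item is the task file. A rough sketch of that include step is shown below; the loop contents and the task name templating are assumptions inferred from the log, not read from the source file.

    # Hypothetical sketch of the include step at assert_bond_options.yml:16.
    # The looped file list is an assumption; only the included file name
    # (assert_IPv6_present.yml) is confirmed by the log.
    - name: "Include the task '{{ item }}'"
      include_tasks: "{{ item }}"
      loop:
        - tasks/assert_IPv6_present.yml

The included "** TEST check IPv6" task then mirrors the IPv4 check, running ip -6 a s nm-bond and waiting for the expected address to appear in the registered stdout.
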
11728 1726882205.62671: running TaskExecutor() for managed_node3/TASK: ** TEST check IPv6 11728 1726882205.62747: in run() - task 12673a56-9f93-5c28-a762-000000000652 11728 1726882205.62760: variable 'ansible_search_path' from source: unknown 11728 1726882205.62763: variable 'ansible_search_path' from source: unknown 11728 1726882205.62791: calling self._execute() 11728 1726882205.62866: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882205.62872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882205.62882: variable 'omit' from source: magic vars 11728 1726882205.63141: variable 'ansible_distribution_major_version' from source: facts 11728 1726882205.63152: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882205.63156: variable 'omit' from source: magic vars 11728 1726882205.63188: variable 'omit' from source: magic vars 11728 1726882205.63304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882205.64711: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882205.64755: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882205.64781: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882205.64812: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882205.64833: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882205.64885: variable 'controller_device' from source: play vars 11728 1726882205.64904: variable 'omit' from source: magic vars 11728 1726882205.64927: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882205.64949: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882205.64963: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882205.64975: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882205.64984: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882205.65013: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882205.65016: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882205.65018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882205.65078: Set connection var ansible_connection to ssh 11728 1726882205.65085: Set connection var ansible_shell_executable to /bin/sh 11728 1726882205.65090: Set connection var ansible_timeout to 10 11728 1726882205.65094: Set connection var ansible_shell_type to sh 11728 1726882205.65103: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882205.65108: Set connection var ansible_pipelining to False 11728 1726882205.65126: variable 'ansible_shell_executable' from source: unknown 11728 1726882205.65129: variable 'ansible_connection' from source: unknown 11728 1726882205.65131: variable 'ansible_module_compression' from source: unknown 11728 1726882205.65134: variable 
'ansible_shell_type' from source: unknown 11728 1726882205.65136: variable 'ansible_shell_executable' from source: unknown 11728 1726882205.65139: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882205.65143: variable 'ansible_pipelining' from source: unknown 11728 1726882205.65146: variable 'ansible_timeout' from source: unknown 11728 1726882205.65150: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882205.65225: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882205.65229: variable 'omit' from source: magic vars 11728 1726882205.65235: starting attempt loop 11728 1726882205.65238: running the handler 11728 1726882205.65250: _low_level_execute_command(): starting 11728 1726882205.65255: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882205.65742: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882205.65746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882205.65749: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882205.65751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 11728 1726882205.65753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882205.65803: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882205.65807: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882205.65820: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882205.65868: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882205.67484: stdout chunk (state=3): >>>/root <<< 11728 1726882205.67585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882205.67617: stderr chunk (state=3): >>><<< 11728 1726882205.67622: stdout chunk (state=3): >>><<< 11728 1726882205.67640: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882205.67649: _low_level_execute_command(): starting 11728 1726882205.67658: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882205.6763837-13315-83036134912093 `" && echo ansible-tmp-1726882205.6763837-13315-83036134912093="` echo /root/.ansible/tmp/ansible-tmp-1726882205.6763837-13315-83036134912093 `" ) && sleep 0' 11728 1726882205.68072: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882205.68076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882205.68078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882205.68081: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882205.68083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 11728 1726882205.68085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882205.68132: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882205.68135: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882205.68184: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882205.70026: stdout chunk (state=3): >>>ansible-tmp-1726882205.6763837-13315-83036134912093=/root/.ansible/tmp/ansible-tmp-1726882205.6763837-13315-83036134912093 <<< 11728 1726882205.70132: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882205.70155: stderr chunk (state=3): >>><<< 11728 1726882205.70158: stdout chunk (state=3): >>><<< 11728 1726882205.70172: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882205.6763837-13315-83036134912093=/root/.ansible/tmp/ansible-tmp-1726882205.6763837-13315-83036134912093 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882205.70195: variable 'ansible_module_compression' from source: unknown 11728 1726882205.70231: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882205.70256: variable 'ansible_facts' from source: unknown 11728 1726882205.70315: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882205.6763837-13315-83036134912093/AnsiballZ_command.py 11728 1726882205.70409: Sending initial data 11728 1726882205.70412: Sent initial data (155 bytes) 11728 1726882205.70826: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882205.70829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882205.70832: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882205.70834: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882205.70836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882205.70881: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882205.70885: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882205.70936: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882205.72449: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports 
extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882205.72496: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882205.72538: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmp9w8ylcrc /root/.ansible/tmp/ansible-tmp-1726882205.6763837-13315-83036134912093/AnsiballZ_command.py <<< 11728 1726882205.72542: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882205.6763837-13315-83036134912093/AnsiballZ_command.py" <<< 11728 1726882205.72583: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmp9w8ylcrc" to remote "/root/.ansible/tmp/ansible-tmp-1726882205.6763837-13315-83036134912093/AnsiballZ_command.py" <<< 11728 1726882205.72587: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882205.6763837-13315-83036134912093/AnsiballZ_command.py" <<< 11728 1726882205.73115: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882205.73159: stderr chunk (state=3): >>><<< 11728 1726882205.73163: stdout chunk (state=3): >>><<< 11728 1726882205.73216: done transferring module to remote 11728 1726882205.73227: _low_level_execute_command(): starting 11728 1726882205.73231: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882205.6763837-13315-83036134912093/ /root/.ansible/tmp/ansible-tmp-1726882205.6763837-13315-83036134912093/AnsiballZ_command.py && sleep 0' 11728 1726882205.73672: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882205.73675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882205.73677: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882205.73679: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882205.73681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882205.73729: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882205.73732: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882205.73784: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882205.75472: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 11728 1726882205.75500: stderr chunk (state=3): >>><<< 11728 1726882205.75503: stdout chunk (state=3): >>><<< 11728 1726882205.75515: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882205.75518: _low_level_execute_command(): starting 11728 1726882205.75522: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882205.6763837-13315-83036134912093/AnsiballZ_command.py && sleep 0' 11728 1726882205.75937: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882205.75941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882205.75943: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 11728 1726882205.75945: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882205.75947: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882205.75999: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882205.76002: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882205.76007: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882205.76059: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882205.91471: stdout chunk (state=3): >>> {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::129/128 scope global dynamic noprefixroute \n valid_lft 227sec preferred_lft 227sec\n 
inet6 2001:db8::c038:23ff:fe98:9e65/64 scope global dynamic noprefixroute \n valid_lft 1794sec preferred_lft 1794sec\n inet6 fe80::c038:23ff:fe98:9e65/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-20 21:30:05.909199", "end": "2024-09-20 21:30:05.912965", "delta": "0:00:00.003766", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882205.93102: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882205.93106: stdout chunk (state=3): >>><<< 11728 1726882205.93109: stderr chunk (state=3): >>><<< 11728 1726882205.93112: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::129/128 scope global dynamic noprefixroute \n valid_lft 227sec preferred_lft 227sec\n inet6 2001:db8::c038:23ff:fe98:9e65/64 scope global dynamic noprefixroute \n valid_lft 1794sec preferred_lft 1794sec\n inet6 fe80::c038:23ff:fe98:9e65/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-20 21:30:05.909199", "end": "2024-09-20 21:30:05.912965", "delta": "0:00:00.003766", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
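
Each of these checks pays the full non-pipelined round trip seen in the trace: create a remote temp directory, SFTP AnsiballZ_command.py across, chmod it, execute it with /usr/bin/python3.12, then remove the directory, because the connection vars set ansible_pipelining to False. When the target environment allows it, enabling pipelining lets Ansible feed the module to the remote interpreter over the already-multiplexed SSH session, skipping the temp-dir, transfer, chmod, and cleanup steps for modules that support it. A minimal sketch, assuming inventory group_vars are an acceptable place to set it (this is an optional tuning suggestion, not something the test playbook does):

    # Hypothetical group_vars snippet; pipelining requires that become
    # methods on the target do not demand a tty (no "requiretty" in sudoers).
    ansible_pipelining: true

With pipelining enabled, the "Sending initial data", "sftp> put", and chmod entries above would drop out of the -vvv trace, leaving only the module execution and its JSON result.
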
11728 1726882205.93121: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882205.6763837-13315-83036134912093/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882205.93123: _low_level_execute_command(): starting 11728 1726882205.93126: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882205.6763837-13315-83036134912093/ > /dev/null 2>&1 && sleep 0' 11728 1726882205.93734: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882205.93749: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882205.93801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882205.93813: stderr chunk (state=3): >>>debug2: match found <<< 11728 1726882205.93898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882205.93920: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882205.94002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882205.95847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882205.95879: stdout chunk (state=3): >>><<< 11728 1726882205.95882: stderr chunk (state=3): >>><<< 11728 1726882205.95905: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882205.96101: handler run complete 11728 1726882205.96104: Evaluated conditional (False): False 11728 1726882205.96141: variable 'address' from source: include params 11728 1726882205.96152: variable 'result' from source: set_fact 11728 1726882205.96173: Evaluated conditional (address in result.stdout): True 11728 1726882205.96196: attempt loop complete, returning result 11728 1726882205.96215: _execute() done 11728 1726882205.96229: dumping result to json 11728 1726882205.96237: done dumping result, returning 11728 1726882205.96248: done running TaskExecutor() for managed_node3/TASK: ** TEST check IPv6 [12673a56-9f93-5c28-a762-000000000652] 11728 1726882205.96255: sending task result for task 12673a56-9f93-5c28-a762-000000000652 11728 1726882205.96802: done sending task result for task 12673a56-9f93-5c28-a762-000000000652 11728 1726882205.96805: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "nm-bond" ], "delta": "0:00:00.003766", "end": "2024-09-20 21:30:05.912965", "rc": 0, "start": "2024-09-20 21:30:05.909199" } STDOUT: 18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 2001:db8::129/128 scope global dynamic noprefixroute valid_lft 227sec preferred_lft 227sec inet6 2001:db8::c038:23ff:fe98:9e65/64 scope global dynamic noprefixroute valid_lft 1794sec preferred_lft 1794sec inet6 fe80::c038:23ff:fe98:9e65/64 scope link noprefixroute valid_lft forever preferred_lft forever 11728 1726882205.96880: no more pending results, returning what we have 11728 1726882205.96883: results queue empty 11728 1726882205.96884: checking for any_errors_fatal 11728 1726882205.96886: done checking for any_errors_fatal 11728 1726882205.96886: checking for max_fail_percentage 11728 1726882205.96888: done checking for max_fail_percentage 11728 1726882205.96889: checking to see if all hosts have failed and the running result is not ok 11728 1726882205.96890: done checking to see if all hosts have failed 11728 1726882205.96891: getting the remaining hosts for this loop 11728 1726882205.96892: done getting the remaining hosts for this loop 11728 1726882205.96898: getting the next task for host managed_node3 11728 1726882205.96907: done getting next task for host managed_node3 11728 1726882205.96911: ^ task is: TASK: Conditional asserts 11728 1726882205.96913: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882205.96918: getting variables 11728 1726882205.96920: in VariableManager get_vars() 11728 1726882205.96950: Calling all_inventory to load vars for managed_node3 11728 1726882205.96954: Calling groups_inventory to load vars for managed_node3 11728 1726882205.96957: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882205.96969: Calling all_plugins_play to load vars for managed_node3 11728 1726882205.96972: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882205.96975: Calling groups_plugins_play to load vars for managed_node3 11728 1726882205.99090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882206.01052: done with get_vars() 11728 1726882206.01075: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 21:30:06 -0400 (0:00:00.389) 0:00:30.864 ****** 11728 1726882206.01178: entering _queue_task() for managed_node3/include_tasks 11728 1726882206.01624: worker is 1 (out of 1 available) 11728 1726882206.01637: exiting _queue_task() for managed_node3/include_tasks 11728 1726882206.01650: done queuing things up, now waiting for results queue to drain 11728 1726882206.01652: waiting for pending results... 11728 1726882206.02314: running TaskExecutor() for managed_node3/TASK: Conditional asserts 11728 1726882206.02520: in run() - task 12673a56-9f93-5c28-a762-00000000008e 11728 1726882206.02525: variable 'ansible_search_path' from source: unknown 11728 1726882206.02529: variable 'ansible_search_path' from source: unknown 11728 1726882206.02883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882206.05058: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882206.05146: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882206.05188: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882206.05232: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882206.05267: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882206.05354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882206.05389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882206.05424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882206.05475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882206.05497: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882206.05675: dumping result to json 11728 1726882206.05678: done dumping result, returning 11728 1726882206.05680: done running TaskExecutor() for managed_node3/TASK: Conditional asserts [12673a56-9f93-5c28-a762-00000000008e] 11728 1726882206.05682: sending task result for task 12673a56-9f93-5c28-a762-00000000008e skipping: [managed_node3] => { "changed": false, "skipped_reason": "No items in the list" } 11728 1726882206.05948: no more pending results, returning what we have 11728 1726882206.05952: results queue empty 11728 1726882206.05954: checking for any_errors_fatal 11728 1726882206.05965: done checking for any_errors_fatal 11728 1726882206.05966: checking for max_fail_percentage 11728 1726882206.05967: done checking for max_fail_percentage 11728 1726882206.05968: checking to see if all hosts have failed and the running result is not ok 11728 1726882206.05969: done checking to see if all hosts have failed 11728 1726882206.05970: getting the remaining hosts for this loop 11728 1726882206.05971: done getting the remaining hosts for this loop 11728 1726882206.05975: getting the next task for host managed_node3 11728 1726882206.05982: done getting next task for host managed_node3 11728 1726882206.05985: ^ task is: TASK: Success in test '{{ lsr_description }}' 11728 1726882206.05988: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882206.05992: getting variables 11728 1726882206.05995: in VariableManager get_vars() 11728 1726882206.06030: Calling all_inventory to load vars for managed_node3 11728 1726882206.06033: Calling groups_inventory to load vars for managed_node3 11728 1726882206.06036: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882206.06048: Calling all_plugins_play to load vars for managed_node3 11728 1726882206.06051: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882206.06054: Calling groups_plugins_play to load vars for managed_node3 11728 1726882206.06576: done sending task result for task 12673a56-9f93-5c28-a762-00000000008e 11728 1726882206.06579: WORKER PROCESS EXITING 11728 1726882206.07643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882206.10332: done with get_vars() 11728 1726882206.10362: done getting variables 11728 1726882206.10627: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11728 1726882206.10740: variable 'lsr_description' from source: include params TASK [Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.'] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 21:30:06 -0400 (0:00:00.095) 0:00:30.960 ****** 11728 1726882206.10770: entering _queue_task() for managed_node3/debug 11728 1726882206.11726: worker is 1 (out of 1 available) 11728 1726882206.11738: exiting _queue_task() for managed_node3/debug 11728 1726882206.11750: done queuing things up, now waiting for results queue to drain 11728 1726882206.11752: waiting for pending results... 11728 1726882206.12274: running TaskExecutor() for managed_node3/TASK: Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' 
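The task whose header was just printed (run_test.yml:47) is a debug action whose name templates in lsr_description; its only job is to print the success banner that appears in the result a few lines further down. A plausible sketch, assuming the banner format shown in that result:

    - name: "Success in test '{{ lsr_description }}'"
      debug:
        msg: "+++++ Success in test '{{ lsr_description }}' +++++"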
11728 1726882206.12305: in run() - task 12673a56-9f93-5c28-a762-00000000008f 11728 1726882206.12325: variable 'ansible_search_path' from source: unknown 11728 1726882206.12332: variable 'ansible_search_path' from source: unknown 11728 1726882206.12379: calling self._execute() 11728 1726882206.12499: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882206.12511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882206.12526: variable 'omit' from source: magic vars 11728 1726882206.12896: variable 'ansible_distribution_major_version' from source: facts 11728 1726882206.12919: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882206.13000: variable 'omit' from source: magic vars 11728 1726882206.13004: variable 'omit' from source: magic vars 11728 1726882206.13074: variable 'lsr_description' from source: include params 11728 1726882206.13101: variable 'omit' from source: magic vars 11728 1726882206.13148: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882206.13186: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882206.13221: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882206.13247: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882206.13262: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882206.13300: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882206.13309: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882206.13316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882206.13413: Set connection var ansible_connection to ssh 11728 1726882206.13429: Set connection var ansible_shell_executable to /bin/sh 11728 1726882206.13450: Set connection var ansible_timeout to 10 11728 1726882206.13453: Set connection var ansible_shell_type to sh 11728 1726882206.13461: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882206.13620: Set connection var ansible_pipelining to False 11728 1726882206.13624: variable 'ansible_shell_executable' from source: unknown 11728 1726882206.13626: variable 'ansible_connection' from source: unknown 11728 1726882206.13629: variable 'ansible_module_compression' from source: unknown 11728 1726882206.13631: variable 'ansible_shell_type' from source: unknown 11728 1726882206.13633: variable 'ansible_shell_executable' from source: unknown 11728 1726882206.13635: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882206.13636: variable 'ansible_pipelining' from source: unknown 11728 1726882206.13638: variable 'ansible_timeout' from source: unknown 11728 1726882206.13640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882206.13707: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882206.13725: variable 'omit' from source: magic vars 11728 1726882206.13735: 
starting attempt loop 11728 1726882206.13742: running the handler 11728 1726882206.13799: handler run complete 11728 1726882206.13819: attempt loop complete, returning result 11728 1726882206.13827: _execute() done 11728 1726882206.13834: dumping result to json 11728 1726882206.13842: done dumping result, returning 11728 1726882206.13854: done running TaskExecutor() for managed_node3/TASK: Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' [12673a56-9f93-5c28-a762-00000000008f] 11728 1726882206.13871: sending task result for task 12673a56-9f93-5c28-a762-00000000008f ok: [managed_node3] => {} MSG: +++++ Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' +++++ 11728 1726882206.14023: no more pending results, returning what we have 11728 1726882206.14027: results queue empty 11728 1726882206.14028: checking for any_errors_fatal 11728 1726882206.14035: done checking for any_errors_fatal 11728 1726882206.14036: checking for max_fail_percentage 11728 1726882206.14038: done checking for max_fail_percentage 11728 1726882206.14039: checking to see if all hosts have failed and the running result is not ok 11728 1726882206.14039: done checking to see if all hosts have failed 11728 1726882206.14040: getting the remaining hosts for this loop 11728 1726882206.14042: done getting the remaining hosts for this loop 11728 1726882206.14045: getting the next task for host managed_node3 11728 1726882206.14053: done getting next task for host managed_node3 11728 1726882206.14057: ^ task is: TASK: Cleanup 11728 1726882206.14061: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882206.14066: getting variables 11728 1726882206.14067: in VariableManager get_vars() 11728 1726882206.14103: Calling all_inventory to load vars for managed_node3 11728 1726882206.14107: Calling groups_inventory to load vars for managed_node3 11728 1726882206.14110: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882206.14122: Calling all_plugins_play to load vars for managed_node3 11728 1726882206.14125: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882206.14129: Calling groups_plugins_play to load vars for managed_node3 11728 1726882206.14908: done sending task result for task 12673a56-9f93-5c28-a762-00000000008f 11728 1726882206.14912: WORKER PROCESS EXITING 11728 1726882206.19949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882206.21463: done with get_vars() 11728 1726882206.21487: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 21:30:06 -0400 (0:00:00.107) 0:00:31.068 ****** 11728 1726882206.21569: entering _queue_task() for managed_node3/include_tasks 11728 1726882206.21919: worker is 1 (out of 1 available) 11728 1726882206.21931: exiting _queue_task() for managed_node3/include_tasks 11728 1726882206.21943: done queuing things up, now waiting for results queue to drain 11728 1726882206.21945: waiting for pending results... 11728 1726882206.22234: running TaskExecutor() for managed_node3/TASK: Cleanup 11728 1726882206.22350: in run() - task 12673a56-9f93-5c28-a762-000000000093 11728 1726882206.22371: variable 'ansible_search_path' from source: unknown 11728 1726882206.22378: variable 'ansible_search_path' from source: unknown 11728 1726882206.22435: variable 'lsr_cleanup' from source: include params 11728 1726882206.22650: variable 'lsr_cleanup' from source: include params 11728 1726882206.22726: variable 'omit' from source: magic vars 11728 1726882206.23002: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882206.23007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882206.23009: variable 'omit' from source: magic vars 11728 1726882206.23174: variable 'ansible_distribution_major_version' from source: facts 11728 1726882206.23190: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882206.23207: variable 'item' from source: unknown 11728 1726882206.23282: variable 'item' from source: unknown 11728 1726882206.23323: variable 'item' from source: unknown 11728 1726882206.23397: variable 'item' from source: unknown 11728 1726882206.23813: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882206.23817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882206.23819: variable 'omit' from source: magic vars 11728 1726882206.23821: variable 'ansible_distribution_major_version' from source: facts 11728 1726882206.23824: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882206.23826: variable 'item' from source: unknown 11728 1726882206.23870: variable 'item' from source: unknown 11728 1726882206.23907: variable 'item' from source: unknown 11728 1726882206.23975: variable 'item' from source: unknown 11728 1726882206.24126: dumping result to json 11728 
1726882206.24129: done dumping result, returning 11728 1726882206.24132: done running TaskExecutor() for managed_node3/TASK: Cleanup [12673a56-9f93-5c28-a762-000000000093] 11728 1726882206.24134: sending task result for task 12673a56-9f93-5c28-a762-000000000093 11728 1726882206.24180: done sending task result for task 12673a56-9f93-5c28-a762-000000000093 11728 1726882206.24183: WORKER PROCESS EXITING 11728 1726882206.24254: no more pending results, returning what we have 11728 1726882206.24260: in VariableManager get_vars() 11728 1726882206.24303: Calling all_inventory to load vars for managed_node3 11728 1726882206.24306: Calling groups_inventory to load vars for managed_node3 11728 1726882206.24310: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882206.24323: Calling all_plugins_play to load vars for managed_node3 11728 1726882206.24327: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882206.24330: Calling groups_plugins_play to load vars for managed_node3 11728 1726882206.25930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882206.27468: done with get_vars() 11728 1726882206.27486: variable 'ansible_search_path' from source: unknown 11728 1726882206.27488: variable 'ansible_search_path' from source: unknown 11728 1726882206.27532: variable 'ansible_search_path' from source: unknown 11728 1726882206.27534: variable 'ansible_search_path' from source: unknown 11728 1726882206.27562: we have included files to process 11728 1726882206.27563: generating all_blocks data 11728 1726882206.27565: done generating all_blocks data 11728 1726882206.27570: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 11728 1726882206.27571: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 11728 1726882206.27573: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 11728 1726882206.27807: in VariableManager get_vars() 11728 1726882206.27828: done with get_vars() 11728 1726882206.27834: variable 'omit' from source: magic vars 11728 1726882206.27875: variable 'omit' from source: magic vars 11728 1726882206.27932: in VariableManager get_vars() 11728 1726882206.27944: done with get_vars() 11728 1726882206.27969: in VariableManager get_vars() 11728 1726882206.27984: done with get_vars() 11728 1726882206.28025: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 11728 1726882206.28190: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 11728 1726882206.28272: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 11728 1726882206.28661: in VariableManager get_vars() 11728 1726882206.28679: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11728 1726882206.31083: done processing included file 11728 1726882206.31085: iterating over new_blocks loaded from include file 11728 1726882206.31087: in VariableManager get_vars() 11728 1726882206.31437: done with get_vars() 11728 1726882206.31440: filtering new block on tags 11728 1726882206.32947: done filtering new block on tags 11728 
1726882206.32952: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml for managed_node3 => (item=tasks/cleanup_bond_profile+device.yml) 11728 1726882206.32958: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 11728 1726882206.32959: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 11728 1726882206.32964: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 11728 1726882206.34732: done processing included file 11728 1726882206.34736: iterating over new_blocks loaded from include file 11728 1726882206.34738: in VariableManager get_vars() 11728 1726882206.34769: done with get_vars() 11728 1726882206.34776: filtering new block on tags 11728 1726882206.34871: done filtering new block on tags 11728 1726882206.34875: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml for managed_node3 => (item=tasks/remove_test_interfaces_with_dhcp.yml) 11728 1726882206.34901: extending task lists for all hosts with included blocks 11728 1726882206.37851: done extending task lists 11728 1726882206.37853: done processing included files 11728 1726882206.37854: results queue empty 11728 1726882206.37855: checking for any_errors_fatal 11728 1726882206.37860: done checking for any_errors_fatal 11728 1726882206.37861: checking for max_fail_percentage 11728 1726882206.37862: done checking for max_fail_percentage 11728 1726882206.37863: checking to see if all hosts have failed and the running result is not ok 11728 1726882206.37863: done checking to see if all hosts have failed 11728 1726882206.37864: getting the remaining hosts for this loop 11728 1726882206.37865: done getting the remaining hosts for this loop 11728 1726882206.37868: getting the next task for host managed_node3 11728 1726882206.37873: done getting next task for host managed_node3 11728 1726882206.37881: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11728 1726882206.37885: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11728 1726882206.37898: getting variables 11728 1726882206.37900: in VariableManager get_vars() 11728 1726882206.37915: Calling all_inventory to load vars for managed_node3 11728 1726882206.37917: Calling groups_inventory to load vars for managed_node3 11728 1726882206.37919: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882206.37925: Calling all_plugins_play to load vars for managed_node3 11728 1726882206.37927: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882206.37930: Calling groups_plugins_play to load vars for managed_node3 11728 1726882206.40443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882206.43676: done with get_vars() 11728 1726882206.43713: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:30:06 -0400 (0:00:00.222) 0:00:31.290 ****** 11728 1726882206.43999: entering _queue_task() for managed_node3/include_tasks 11728 1726882206.44570: worker is 1 (out of 1 available) 11728 1726882206.44582: exiting _queue_task() for managed_node3/include_tasks 11728 1726882206.45000: done queuing things up, now waiting for results queue to drain 11728 1726882206.45002: waiting for pending results... 11728 1726882206.45244: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11728 1726882206.45604: in run() - task 12673a56-9f93-5c28-a762-000000000693 11728 1726882206.45608: variable 'ansible_search_path' from source: unknown 11728 1726882206.45612: variable 'ansible_search_path' from source: unknown 11728 1726882206.45615: calling self._execute() 11728 1726882206.45790: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882206.45910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882206.45931: variable 'omit' from source: magic vars 11728 1726882206.46706: variable 'ansible_distribution_major_version' from source: facts 11728 1726882206.46900: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882206.46905: _execute() done 11728 1726882206.46908: dumping result to json 11728 1726882206.46910: done dumping result, returning 11728 1726882206.46913: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-5c28-a762-000000000693] 11728 1726882206.46915: sending task result for task 12673a56-9f93-5c28-a762-000000000693 11728 1726882206.46986: done sending task result for task 12673a56-9f93-5c28-a762-000000000693 11728 1726882206.46989: WORKER PROCESS EXITING 11728 1726882206.47036: no more pending results, returning what we have 11728 1726882206.47041: in VariableManager get_vars() 11728 1726882206.47087: Calling all_inventory to load vars for managed_node3 11728 1726882206.47090: Calling groups_inventory to load vars for managed_node3 11728 1726882206.47097: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882206.47110: Calling all_plugins_play to load vars for managed_node3 11728 1726882206.47113: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882206.47117: Calling groups_plugins_play to load vars for managed_node3 11728 
1726882206.50404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882206.53600: done with get_vars() 11728 1726882206.53627: variable 'ansible_search_path' from source: unknown 11728 1726882206.53629: variable 'ansible_search_path' from source: unknown 11728 1726882206.53673: we have included files to process 11728 1726882206.53674: generating all_blocks data 11728 1726882206.53676: done generating all_blocks data 11728 1726882206.53678: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11728 1726882206.53679: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11728 1726882206.53681: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11728 1726882206.54776: done processing included file 11728 1726882206.54778: iterating over new_blocks loaded from include file 11728 1726882206.54780: in VariableManager get_vars() 11728 1726882206.54913: done with get_vars() 11728 1726882206.54916: filtering new block on tags 11728 1726882206.54948: done filtering new block on tags 11728 1726882206.54951: in VariableManager get_vars() 11728 1726882206.54977: done with get_vars() 11728 1726882206.54979: filtering new block on tags 11728 1726882206.55234: done filtering new block on tags 11728 1726882206.55237: in VariableManager get_vars() 11728 1726882206.55261: done with get_vars() 11728 1726882206.55263: filtering new block on tags 11728 1726882206.55313: done filtering new block on tags 11728 1726882206.55315: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 11728 1726882206.55321: extending task lists for all hosts with included blocks 11728 1726882206.58944: done extending task lists 11728 1726882206.58945: done processing included files 11728 1726882206.58946: results queue empty 11728 1726882206.58947: checking for any_errors_fatal 11728 1726882206.58951: done checking for any_errors_fatal 11728 1726882206.58951: checking for max_fail_percentage 11728 1726882206.58952: done checking for max_fail_percentage 11728 1726882206.58953: checking to see if all hosts have failed and the running result is not ok 11728 1726882206.58954: done checking to see if all hosts have failed 11728 1726882206.58954: getting the remaining hosts for this loop 11728 1726882206.58956: done getting the remaining hosts for this loop 11728 1726882206.58958: getting the next task for host managed_node3 11728 1726882206.58963: done getting next task for host managed_node3 11728 1726882206.58966: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11728 1726882206.58971: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882206.58981: getting variables 11728 1726882206.58983: in VariableManager get_vars() 11728 1726882206.59205: Calling all_inventory to load vars for managed_node3 11728 1726882206.59208: Calling groups_inventory to load vars for managed_node3 11728 1726882206.59210: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882206.59217: Calling all_plugins_play to load vars for managed_node3 11728 1726882206.59220: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882206.59223: Calling groups_plugins_play to load vars for managed_node3 11728 1726882206.61598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882206.64882: done with get_vars() 11728 1726882206.64910: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:30:06 -0400 (0:00:00.211) 0:00:31.502 ****** 11728 1726882206.65198: entering _queue_task() for managed_node3/setup 11728 1726882206.65761: worker is 1 (out of 1 available) 11728 1726882206.65773: exiting _queue_task() for managed_node3/setup 11728 1726882206.65786: done queuing things up, now waiting for results queue to drain 11728 1726882206.65788: waiting for pending results... 
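The setup task queued here (set_facts.yml:3) gathers facts only when something the role needs is missing; the log below shows its when expression, "__network_required_facts | difference(ansible_facts.keys() | list) | length > 0", evaluating to False, so the task is skipped. A rough sketch of that pattern, with the gather_subset value chosen purely for illustration:

    - name: Ensure ansible_facts used by role are present
      setup:
        gather_subset: min   # illustrative subset; the role defines its own list
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0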
11728 1726882206.67110: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11728 1726882206.67299: in run() - task 12673a56-9f93-5c28-a762-0000000007c9 11728 1726882206.67303: variable 'ansible_search_path' from source: unknown 11728 1726882206.67306: variable 'ansible_search_path' from source: unknown 11728 1726882206.67309: calling self._execute() 11728 1726882206.67312: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882206.67898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882206.67902: variable 'omit' from source: magic vars 11728 1726882206.68454: variable 'ansible_distribution_major_version' from source: facts 11728 1726882206.68999: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882206.69699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882206.75491: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882206.75765: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882206.75811: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882206.75850: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882206.75881: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882206.75964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882206.76231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882206.76261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882206.76308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882206.76329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882206.76386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882206.76698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882206.76702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882206.76705: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882206.76712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882206.76990: variable '__network_required_facts' from source: role '' defaults 11728 1726882206.77007: variable 'ansible_facts' from source: unknown 11728 1726882206.78568: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 11728 1726882206.78578: when evaluation is False, skipping this task 11728 1726882206.78585: _execute() done 11728 1726882206.78592: dumping result to json 11728 1726882206.78605: done dumping result, returning 11728 1726882206.78620: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-5c28-a762-0000000007c9] 11728 1726882206.78690: sending task result for task 12673a56-9f93-5c28-a762-0000000007c9 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11728 1726882206.78844: no more pending results, returning what we have 11728 1726882206.78850: results queue empty 11728 1726882206.78851: checking for any_errors_fatal 11728 1726882206.78853: done checking for any_errors_fatal 11728 1726882206.78853: checking for max_fail_percentage 11728 1726882206.78856: done checking for max_fail_percentage 11728 1726882206.78856: checking to see if all hosts have failed and the running result is not ok 11728 1726882206.78857: done checking to see if all hosts have failed 11728 1726882206.78858: getting the remaining hosts for this loop 11728 1726882206.78860: done getting the remaining hosts for this loop 11728 1726882206.78864: getting the next task for host managed_node3 11728 1726882206.78875: done getting next task for host managed_node3 11728 1726882206.78879: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 11728 1726882206.78886: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882206.78906: getting variables 11728 1726882206.78908: in VariableManager get_vars() 11728 1726882206.78948: Calling all_inventory to load vars for managed_node3 11728 1726882206.78951: Calling groups_inventory to load vars for managed_node3 11728 1726882206.78954: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882206.78965: Calling all_plugins_play to load vars for managed_node3 11728 1726882206.78968: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882206.78971: Calling groups_plugins_play to load vars for managed_node3 11728 1726882206.79801: done sending task result for task 12673a56-9f93-5c28-a762-0000000007c9 11728 1726882206.79811: WORKER PROCESS EXITING 11728 1726882206.81977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882206.85184: done with get_vars() 11728 1726882206.85221: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:30:06 -0400 (0:00:00.205) 0:00:31.707 ****** 11728 1726882206.85533: entering _queue_task() for managed_node3/stat 11728 1726882206.86086: worker is 1 (out of 1 available) 11728 1726882206.86502: exiting _queue_task() for managed_node3/stat 11728 1726882206.86513: done queuing things up, now waiting for results queue to drain 11728 1726882206.86514: waiting for pending results... 11728 1726882206.86728: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 11728 1726882206.87165: in run() - task 12673a56-9f93-5c28-a762-0000000007cb 11728 1726882206.87187: variable 'ansible_search_path' from source: unknown 11728 1726882206.87198: variable 'ansible_search_path' from source: unknown 11728 1726882206.87245: calling self._execute() 11728 1726882206.87474: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882206.87486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882206.87503: variable 'omit' from source: magic vars 11728 1726882206.88223: variable 'ansible_distribution_major_version' from source: facts 11728 1726882206.88314: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882206.88587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882206.89200: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882206.89399: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882206.89402: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882206.89419: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882206.89510: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882206.89630: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882206.89691: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882206.89801: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882206.90011: variable '__network_is_ostree' from source: set_fact 11728 1726882206.90025: Evaluated conditional (not __network_is_ostree is defined): False 11728 1726882206.90034: when evaluation is False, skipping this task 11728 1726882206.90042: _execute() done 11728 1726882206.90049: dumping result to json 11728 1726882206.90057: done dumping result, returning 11728 1726882206.90069: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-5c28-a762-0000000007cb] 11728 1726882206.90081: sending task result for task 12673a56-9f93-5c28-a762-0000000007cb 11728 1726882206.90385: done sending task result for task 12673a56-9f93-5c28-a762-0000000007cb 11728 1726882206.90388: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11728 1726882206.90474: no more pending results, returning what we have 11728 1726882206.90478: results queue empty 11728 1726882206.90479: checking for any_errors_fatal 11728 1726882206.90488: done checking for any_errors_fatal 11728 1726882206.90489: checking for max_fail_percentage 11728 1726882206.90491: done checking for max_fail_percentage 11728 1726882206.90491: checking to see if all hosts have failed and the running result is not ok 11728 1726882206.90492: done checking to see if all hosts have failed 11728 1726882206.90495: getting the remaining hosts for this loop 11728 1726882206.90497: done getting the remaining hosts for this loop 11728 1726882206.90501: getting the next task for host managed_node3 11728 1726882206.90510: done getting next task for host managed_node3 11728 1726882206.90514: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11728 1726882206.90521: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882206.90539: getting variables 11728 1726882206.90540: in VariableManager get_vars() 11728 1726882206.90582: Calling all_inventory to load vars for managed_node3 11728 1726882206.90585: Calling groups_inventory to load vars for managed_node3 11728 1726882206.90588: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882206.90803: Calling all_plugins_play to load vars for managed_node3 11728 1726882206.90807: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882206.90810: Calling groups_plugins_play to load vars for managed_node3 11728 1726882206.93975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882206.97230: done with get_vars() 11728 1726882206.97260: done getting variables 11728 1726882206.97324: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:30:06 -0400 (0:00:00.118) 0:00:31.826 ****** 11728 1726882206.97365: entering _queue_task() for managed_node3/set_fact 11728 1726882206.97922: worker is 1 (out of 1 available) 11728 1726882206.97934: exiting _queue_task() for managed_node3/set_fact 11728 1726882206.97946: done queuing things up, now waiting for results queue to drain 11728 1726882206.97948: waiting for pending results... 
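The two tasks at set_facts.yml:12 and set_facts.yml:17 form a cached ostree check: both are guarded by "not __network_is_ostree is defined", so once the flag exists (as it does here, from an earlier run) they skip, as the false_condition fields above and below show. A sketch of that pattern, with the stat path and register name chosen for illustration:

    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted        # illustrative marker file
      register: __ostree_stat           # illustrative register name
      when: not __network_is_ostree is defined

    - name: Set flag to indicate system is ostree
      set_fact:
        __network_is_ostree: "{{ __ostree_stat.stat.exists }}"
      when: not __network_is_ostree is defined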
11728 1726882206.98814: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11728 1726882206.98878: in run() - task 12673a56-9f93-5c28-a762-0000000007cc 11728 1726882206.99035: variable 'ansible_search_path' from source: unknown 11728 1726882206.99044: variable 'ansible_search_path' from source: unknown 11728 1726882206.99085: calling self._execute() 11728 1726882206.99261: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882206.99274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882206.99356: variable 'omit' from source: magic vars 11728 1726882207.00069: variable 'ansible_distribution_major_version' from source: facts 11728 1726882207.00122: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882207.00413: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882207.00982: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882207.01071: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882207.01169: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882207.01278: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882207.01571: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882207.01575: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882207.01578: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882207.01582: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882207.01754: variable '__network_is_ostree' from source: set_fact 11728 1726882207.01800: Evaluated conditional (not __network_is_ostree is defined): False 11728 1726882207.02008: when evaluation is False, skipping this task 11728 1726882207.02011: _execute() done 11728 1726882207.02014: dumping result to json 11728 1726882207.02016: done dumping result, returning 11728 1726882207.02019: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-5c28-a762-0000000007cc] 11728 1726882207.02022: sending task result for task 12673a56-9f93-5c28-a762-0000000007cc 11728 1726882207.02090: done sending task result for task 12673a56-9f93-5c28-a762-0000000007cc 11728 1726882207.02095: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11728 1726882207.02158: no more pending results, returning what we have 11728 1726882207.02162: results queue empty 11728 1726882207.02163: checking for any_errors_fatal 11728 1726882207.02171: done checking for any_errors_fatal 11728 
1726882207.02171: checking for max_fail_percentage 11728 1726882207.02174: done checking for max_fail_percentage 11728 1726882207.02174: checking to see if all hosts have failed and the running result is not ok 11728 1726882207.02175: done checking to see if all hosts have failed 11728 1726882207.02176: getting the remaining hosts for this loop 11728 1726882207.02178: done getting the remaining hosts for this loop 11728 1726882207.02181: getting the next task for host managed_node3 11728 1726882207.02195: done getting next task for host managed_node3 11728 1726882207.02200: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 11728 1726882207.02207: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882207.02226: getting variables 11728 1726882207.02227: in VariableManager get_vars() 11728 1726882207.02270: Calling all_inventory to load vars for managed_node3 11728 1726882207.02273: Calling groups_inventory to load vars for managed_node3 11728 1726882207.02276: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882207.02289: Calling all_plugins_play to load vars for managed_node3 11728 1726882207.02292: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882207.02601: Calling groups_plugins_play to load vars for managed_node3 11728 1726882207.05291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882207.08522: done with get_vars() 11728 1726882207.08554: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:30:07 -0400 (0:00:00.113) 0:00:31.940 ****** 11728 1726882207.08767: entering _queue_task() for managed_node3/service_facts 11728 1726882207.09423: worker is 1 (out of 1 available) 11728 1726882207.09435: exiting _queue_task() for managed_node3/service_facts 11728 1726882207.09448: done queuing things up, now waiting for results queue to drain 11728 1726882207.09449: waiting for pending results... 
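The next task queues the service_facts action (set_facts.yml:21). Its return value, dumped further below under ansible_facts.services, is a dictionary keyed by systemd unit name, where each entry carries name, state, status, and source fields. A minimal sketch of how a playbook could gather and then consume that fact; the second task is purely illustrative and does not appear anywhere in this log:

    - name: Check which services are running
      ansible.builtin.service_facts:

    - name: Report NetworkManager state (illustrative only)
      ansible.builtin.debug:
        msg: "NetworkManager is {{ ansible_facts.services['NetworkManager.service'].state }}"
      when: "'NetworkManager.service' in ansible_facts.services"

On this host the gathered facts below report NetworkManager.service as running and enabled, which is the kind of information the role reads back out of ansible_facts.services after this task completes.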
11728 1726882207.10132: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 11728 1726882207.10556: in run() - task 12673a56-9f93-5c28-a762-0000000007ce 11728 1726882207.10559: variable 'ansible_search_path' from source: unknown 11728 1726882207.10562: variable 'ansible_search_path' from source: unknown 11728 1726882207.10565: calling self._execute() 11728 1726882207.10901: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882207.10905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882207.10908: variable 'omit' from source: magic vars 11728 1726882207.11484: variable 'ansible_distribution_major_version' from source: facts 11728 1726882207.11567: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882207.11579: variable 'omit' from source: magic vars 11728 1726882207.11796: variable 'omit' from source: magic vars 11728 1726882207.11834: variable 'omit' from source: magic vars 11728 1726882207.11992: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882207.12037: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882207.12059: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882207.12082: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882207.12299: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882207.12303: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882207.12306: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882207.12310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882207.12475: Set connection var ansible_connection to ssh 11728 1726882207.12489: Set connection var ansible_shell_executable to /bin/sh 11728 1726882207.12502: Set connection var ansible_timeout to 10 11728 1726882207.12509: Set connection var ansible_shell_type to sh 11728 1726882207.12520: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882207.12531: Set connection var ansible_pipelining to False 11728 1726882207.12699: variable 'ansible_shell_executable' from source: unknown 11728 1726882207.12702: variable 'ansible_connection' from source: unknown 11728 1726882207.12705: variable 'ansible_module_compression' from source: unknown 11728 1726882207.12707: variable 'ansible_shell_type' from source: unknown 11728 1726882207.12709: variable 'ansible_shell_executable' from source: unknown 11728 1726882207.12711: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882207.12713: variable 'ansible_pipelining' from source: unknown 11728 1726882207.12715: variable 'ansible_timeout' from source: unknown 11728 1726882207.12717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882207.13026: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11728 1726882207.13098: variable 'omit' from source: magic vars 11728 
1726882207.13109: starting attempt loop 11728 1726882207.13116: running the handler 11728 1726882207.13135: _low_level_execute_command(): starting 11728 1726882207.13398: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882207.14701: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882207.14805: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882207.14940: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882207.14989: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882207.16828: stdout chunk (state=3): >>>/root <<< 11728 1726882207.16866: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882207.16976: stderr chunk (state=3): >>><<< 11728 1726882207.16985: stdout chunk (state=3): >>><<< 11728 1726882207.17016: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882207.17036: _low_level_execute_command(): starting 11728 1726882207.17085: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882207.1702332-13368-147533453339936 `" && echo ansible-tmp-1726882207.1702332-13368-147533453339936="` echo /root/.ansible/tmp/ansible-tmp-1726882207.1702332-13368-147533453339936 `" ) && 
sleep 0' 11728 1726882207.18365: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882207.18434: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882207.18481: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882207.18506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882207.18637: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882207.20498: stdout chunk (state=3): >>>ansible-tmp-1726882207.1702332-13368-147533453339936=/root/.ansible/tmp/ansible-tmp-1726882207.1702332-13368-147533453339936 <<< 11728 1726882207.20645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882207.20745: stdout chunk (state=3): >>><<< 11728 1726882207.20749: stderr chunk (state=3): >>><<< 11728 1726882207.20752: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882207.1702332-13368-147533453339936=/root/.ansible/tmp/ansible-tmp-1726882207.1702332-13368-147533453339936 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882207.20802: variable 'ansible_module_compression' from source: unknown 11728 1726882207.21001: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 11728 1726882207.21100: variable 'ansible_facts' from source: unknown 11728 1726882207.21500: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882207.1702332-13368-147533453339936/AnsiballZ_service_facts.py 11728 1726882207.21552: Sending initial data 11728 1726882207.21561: Sent initial data (162 bytes) 11728 1726882207.23015: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882207.23029: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882207.23046: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882207.23326: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882207.24828: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882207.24870: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882207.24928: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmphi66xrv_ /root/.ansible/tmp/ansible-tmp-1726882207.1702332-13368-147533453339936/AnsiballZ_service_facts.py <<< 11728 1726882207.24931: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882207.1702332-13368-147533453339936/AnsiballZ_service_facts.py" <<< 11728 1726882207.24971: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmphi66xrv_" to remote "/root/.ansible/tmp/ansible-tmp-1726882207.1702332-13368-147533453339936/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882207.1702332-13368-147533453339936/AnsiballZ_service_facts.py" <<< 11728 1726882207.26701: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882207.26705: stdout chunk (state=3): >>><<< 11728 1726882207.26707: stderr chunk (state=3): >>><<< 11728 1726882207.26709: done transferring module to remote 11728 1726882207.26711: _low_level_execute_command(): starting 11728 1726882207.26715: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882207.1702332-13368-147533453339936/ /root/.ansible/tmp/ansible-tmp-1726882207.1702332-13368-147533453339936/AnsiballZ_service_facts.py && sleep 0' 11728 1726882207.28011: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882207.28233: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882207.28275: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882207.30010: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882207.30024: stdout chunk (state=3): >>><<< 11728 1726882207.30035: stderr chunk (state=3): >>><<< 11728 1726882207.30052: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882207.30064: _low_level_execute_command(): starting 11728 1726882207.30203: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882207.1702332-13368-147533453339936/AnsiballZ_service_facts.py && sleep 0' 11728 1726882207.31418: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882207.31538: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882207.31614: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882208.81308: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 11728 1726882208.81371: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": 
"modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": 
"systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": 
"debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive",<<< 11728 1726882208.81401: stdout chunk (state=3): >>> "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": 
"rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": 
"systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 11728 1726882208.82883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882208.82887: stdout chunk (state=3): >>><<< 11728 1726882208.82889: stderr chunk (state=3): >>><<< 11728 1726882208.83100: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": 
{"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": 
"stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, 
"systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
11728 1726882208.84065: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882207.1702332-13368-147533453339936/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882208.84080: _low_level_execute_command(): starting 11728 1726882208.84090: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882207.1702332-13368-147533453339936/ > /dev/null 2>&1 && sleep 0' 11728 1726882208.84872: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882208.84887: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882208.84917: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882208.84934: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882208.84955: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882208.85040: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882208.86868: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882208.86887: stdout chunk (state=3): >>><<< 11728 1726882208.86901: stderr chunk (state=3): >>><<< 11728 1726882208.86921: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882208.86931: handler run complete 11728 1726882208.87307: variable 'ansible_facts' from source: unknown 11728 1726882208.87314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882208.87840: variable 'ansible_facts' from source: unknown 11728 1726882208.87999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882208.88225: attempt loop complete, returning result 11728 1726882208.88235: _execute() done 11728 1726882208.88241: dumping result to json 11728 1726882208.88316: done dumping result, returning 11728 1726882208.88331: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-5c28-a762-0000000007ce] 11728 1726882208.88340: sending task result for task 12673a56-9f93-5c28-a762-0000000007ce 11728 1726882208.89500: done sending task result for task 12673a56-9f93-5c28-a762-0000000007ce 11728 1726882208.89504: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11728 1726882208.89587: no more pending results, returning what we have 11728 1726882208.89590: results queue empty 11728 1726882208.89591: checking for any_errors_fatal 11728 1726882208.89596: done checking for any_errors_fatal 11728 1726882208.89597: checking for max_fail_percentage 11728 1726882208.89598: done checking for max_fail_percentage 11728 1726882208.89599: checking to see if all hosts have failed and the running result is not ok 11728 1726882208.89599: done checking to see if all hosts have failed 11728 1726882208.89600: getting the remaining hosts for this loop 11728 1726882208.89601: done getting the remaining hosts for this loop 11728 1726882208.89603: getting the next task for host managed_node3 11728 1726882208.89607: done getting next task for host managed_node3 11728 1726882208.89609: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 11728 1726882208.89615: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882208.89622: getting variables 11728 1726882208.89623: in VariableManager get_vars() 11728 1726882208.89643: Calling all_inventory to load vars for managed_node3 11728 1726882208.89645: Calling groups_inventory to load vars for managed_node3 11728 1726882208.89646: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882208.89652: Calling all_plugins_play to load vars for managed_node3 11728 1726882208.89653: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882208.89655: Calling groups_plugins_play to load vars for managed_node3 11728 1726882208.90332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882208.91300: done with get_vars() 11728 1726882208.91320: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:30:08 -0400 (0:00:01.826) 0:00:33.766 ****** 11728 1726882208.91414: entering _queue_task() for managed_node3/package_facts 11728 1726882208.91730: worker is 1 (out of 1 available) 11728 1726882208.91744: exiting _queue_task() for managed_node3/package_facts 11728 1726882208.91755: done queuing things up, now waiting for results queue to drain 11728 1726882208.91756: waiting for pending results... 
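The package_facts task queued here follows the same execution path the trace shows for service_facts: a remote tmpdir is created under ~/.ansible/tmp, the AnsiballZ_package_facts.py wrapper is copied over the multiplexed SSH connection and run with /usr/bin/python3.12, and the module returns an ansible_facts.packages dictionary keyed by package name. A hedged sketch of invoking the module directly and reading back one of the entries that appears later in this trace; these tasks would sit in a play's tasks list, and the task names are illustrative:

- name: Check which packages are installed
  ansible.builtin.package_facts:

# The glibc entry below corresponds to the package data returned further down
# in this log ("version": "2.39", "release": "17.el10").
- name: Show the installed glibc build reported by package_facts
  ansible.builtin.debug:
    msg: "glibc {{ ansible_facts.packages['glibc'][0].version }}-{{ ansible_facts.packages['glibc'][0].release }}"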
11728 1726882208.92112: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 11728 1726882208.92204: in run() - task 12673a56-9f93-5c28-a762-0000000007cf 11728 1726882208.92227: variable 'ansible_search_path' from source: unknown 11728 1726882208.92237: variable 'ansible_search_path' from source: unknown 11728 1726882208.92279: calling self._execute() 11728 1726882208.92370: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882208.92381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882208.92399: variable 'omit' from source: magic vars 11728 1726882208.92768: variable 'ansible_distribution_major_version' from source: facts 11728 1726882208.92784: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882208.92886: variable 'omit' from source: magic vars 11728 1726882208.92890: variable 'omit' from source: magic vars 11728 1726882208.92940: variable 'omit' from source: magic vars 11728 1726882208.92966: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882208.92995: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882208.93013: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882208.93026: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882208.93052: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882208.93068: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882208.93071: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882208.93074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882208.93148: Set connection var ansible_connection to ssh 11728 1726882208.93158: Set connection var ansible_shell_executable to /bin/sh 11728 1726882208.93160: Set connection var ansible_timeout to 10 11728 1726882208.93163: Set connection var ansible_shell_type to sh 11728 1726882208.93165: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882208.93167: Set connection var ansible_pipelining to False 11728 1726882208.93186: variable 'ansible_shell_executable' from source: unknown 11728 1726882208.93189: variable 'ansible_connection' from source: unknown 11728 1726882208.93191: variable 'ansible_module_compression' from source: unknown 11728 1726882208.93195: variable 'ansible_shell_type' from source: unknown 11728 1726882208.93200: variable 'ansible_shell_executable' from source: unknown 11728 1726882208.93203: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882208.93206: variable 'ansible_pipelining' from source: unknown 11728 1726882208.93209: variable 'ansible_timeout' from source: unknown 11728 1726882208.93213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882208.93357: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11728 1726882208.93376: variable 'omit' from source: magic vars 11728 
1726882208.93379: starting attempt loop 11728 1726882208.93382: running the handler 11728 1726882208.93387: _low_level_execute_command(): starting 11728 1726882208.93396: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882208.93868: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882208.93905: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882208.93910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882208.93913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882208.93940: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882208.93943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882208.93977: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882208.93981: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882208.93985: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882208.94039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882208.95692: stdout chunk (state=3): >>>/root <<< 11728 1726882208.95854: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882208.95857: stdout chunk (state=3): >>><<< 11728 1726882208.95860: stderr chunk (state=3): >>><<< 11728 1726882208.95931: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882208.95936: _low_level_execute_command(): starting 11728 1726882208.95942: 
_low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882208.9588728-13427-215016035678578 `" && echo ansible-tmp-1726882208.9588728-13427-215016035678578="` echo /root/.ansible/tmp/ansible-tmp-1726882208.9588728-13427-215016035678578 `" ) && sleep 0' 11728 1726882208.96373: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882208.96378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882208.96388: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 11728 1726882208.96391: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882208.96437: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882208.96443: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882208.96444: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882208.96487: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882208.98777: stdout chunk (state=3): >>>ansible-tmp-1726882208.9588728-13427-215016035678578=/root/.ansible/tmp/ansible-tmp-1726882208.9588728-13427-215016035678578 <<< 11728 1726882208.98780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882208.98785: stdout chunk (state=3): >>><<< 11728 1726882208.98787: stderr chunk (state=3): >>><<< 11728 1726882208.98789: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882208.9588728-13427-215016035678578=/root/.ansible/tmp/ansible-tmp-1726882208.9588728-13427-215016035678578 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882208.98903: variable 'ansible_module_compression' from source: unknown 11728 1726882208.98918: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 11728 1726882208.98980: variable 'ansible_facts' from source: unknown 11728 1726882208.99204: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882208.9588728-13427-215016035678578/AnsiballZ_package_facts.py 11728 1726882208.99585: Sending initial data 11728 1726882208.99588: Sent initial data (162 bytes) 11728 1726882209.00960: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882209.00973: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882209.01009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882209.01153: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882209.01158: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882209.01257: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882209.01320: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882209.01405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882209.02924: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882209.02961: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882209.03015: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmp_gjk37xq /root/.ansible/tmp/ansible-tmp-1726882208.9588728-13427-215016035678578/AnsiballZ_package_facts.py <<< 11728 1726882209.03019: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882208.9588728-13427-215016035678578/AnsiballZ_package_facts.py" <<< 11728 1726882209.03099: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmp_gjk37xq" to remote "/root/.ansible/tmp/ansible-tmp-1726882208.9588728-13427-215016035678578/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882208.9588728-13427-215016035678578/AnsiballZ_package_facts.py" <<< 11728 1726882209.06114: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882209.06212: stdout chunk (state=3): >>><<< 11728 1726882209.06215: stderr chunk (state=3): >>><<< 11728 1726882209.06218: done transferring module to remote 11728 1726882209.06220: _low_level_execute_command(): starting 11728 1726882209.06234: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882208.9588728-13427-215016035678578/ /root/.ansible/tmp/ansible-tmp-1726882208.9588728-13427-215016035678578/AnsiballZ_package_facts.py && sleep 0' 11728 1726882209.06910: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882209.07011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882209.07064: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882209.07092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882209.07210: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882209.09101: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882209.09104: stdout chunk (state=3): >>><<< 11728 1726882209.09107: stderr chunk (state=3): >>><<< 11728 1726882209.09109: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882209.09111: _low_level_execute_command(): starting 11728 1726882209.09113: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882208.9588728-13427-215016035678578/AnsiballZ_package_facts.py && sleep 0' 11728 1726882209.10301: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882209.10305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882209.10348: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882209.10378: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882209.10381: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882209.10744: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882209.54213: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 11728 1726882209.54234: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", 
"release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 11728 1726882209.54263: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": 
"lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 11728 1726882209.54420: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": 
"0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": 
"0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 11728 1726882209.54432: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": 
[{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.7<<< 11728 1726882209.54439: stdout chunk (state=3): >>>3.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", 
"version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": 
"0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": 
[{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 11728 1726882209.54448: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", 
"release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 11728 1726882209.56084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882209.56120: stderr chunk (state=3): >>><<< 11728 1726882209.56123: stdout chunk (state=3): >>><<< 11728 1726882209.56170: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": 
"0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", 
"version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", 
"version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": 
"1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": 
"x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", 
"version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", 
"version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
11728 1726882209.57808: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882208.9588728-13427-215016035678578/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882209.57825: _low_level_execute_command(): starting 11728 1726882209.57829: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882208.9588728-13427-215016035678578/ > /dev/null 2>&1 && sleep 0' 11728 1726882209.58282: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882209.58306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882209.58310: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882209.58313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882209.58324: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882209.58375: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882209.58379: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882209.58388: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882209.58456: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882209.60241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882209.60271: stderr chunk (state=3): >>><<< 11728 1726882209.60274: stdout chunk (state=3): >>><<< 11728 1726882209.60291: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882209.60298: handler run complete 11728 1726882209.60792: variable 'ansible_facts' from source: unknown 11728 1726882209.61199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882209.62442: variable 'ansible_facts' from source: unknown 11728 1726882209.62785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882209.63346: attempt loop complete, returning result 11728 1726882209.63353: _execute() done 11728 1726882209.63356: dumping result to json 11728 1726882209.63523: done dumping result, returning 11728 1726882209.63526: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-5c28-a762-0000000007cf] 11728 1726882209.63531: sending task result for task 12673a56-9f93-5c28-a762-0000000007cf ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11728 1726882209.65343: no more pending results, returning what we have 11728 1726882209.65346: results queue empty 11728 1726882209.65347: checking for any_errors_fatal 11728 1726882209.65351: done checking for any_errors_fatal 11728 1726882209.65352: checking for max_fail_percentage 11728 1726882209.65354: done checking for max_fail_percentage 11728 1726882209.65355: checking to see if all hosts have failed and the running result is not ok 11728 1726882209.65355: done checking to see if all hosts have failed 11728 1726882209.65356: getting the remaining hosts for this loop 11728 1726882209.65357: done getting the remaining hosts for this loop 11728 1726882209.65362: getting the next task for host managed_node3 11728 1726882209.65375: done getting next task for host managed_node3 11728 1726882209.65378: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 11728 1726882209.65384: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882209.65392: done sending task result for task 12673a56-9f93-5c28-a762-0000000007cf 11728 1726882209.65400: WORKER PROCESS EXITING 11728 1726882209.65406: getting variables 11728 1726882209.65408: in VariableManager get_vars() 11728 1726882209.65430: Calling all_inventory to load vars for managed_node3 11728 1726882209.65432: Calling groups_inventory to load vars for managed_node3 11728 1726882209.65433: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882209.65445: Calling all_plugins_play to load vars for managed_node3 11728 1726882209.65448: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882209.65451: Calling groups_plugins_play to load vars for managed_node3 11728 1726882209.66529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882209.67711: done with get_vars() 11728 1726882209.67737: done getting variables 11728 1726882209.67780: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:30:09 -0400 (0:00:00.763) 0:00:34.530 ****** 11728 1726882209.67810: entering _queue_task() for managed_node3/debug 11728 1726882209.68049: worker is 1 (out of 1 available) 11728 1726882209.68062: exiting _queue_task() for managed_node3/debug 11728 1726882209.68075: done queuing things up, now waiting for results queue to drain 11728 1726882209.68077: waiting for pending results... 
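The banner above queues the "Print network provider" task from roles/network/tasks/main.yml:7. The trace that follows resolves network_provider (set earlier via set_fact) and reports "Using network provider: nm". A minimal sketch of such a debug task, assuming only what this trace shows, is:

    # Sketch only: a debug task reporting the provider chosen earlier.
    # The guard on ansible_distribution_major_version != '6' is evaluated
    # before the handler runs, as the trace below shows.
    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"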
11728 1726882209.68261: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 11728 1726882209.68353: in run() - task 12673a56-9f93-5c28-a762-000000000694 11728 1726882209.68370: variable 'ansible_search_path' from source: unknown 11728 1726882209.68398: variable 'ansible_search_path' from source: unknown 11728 1726882209.68425: calling self._execute() 11728 1726882209.68490: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882209.68498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882209.68505: variable 'omit' from source: magic vars 11728 1726882209.68780: variable 'ansible_distribution_major_version' from source: facts 11728 1726882209.68789: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882209.68798: variable 'omit' from source: magic vars 11728 1726882209.68852: variable 'omit' from source: magic vars 11728 1726882209.68916: variable 'network_provider' from source: set_fact 11728 1726882209.68930: variable 'omit' from source: magic vars 11728 1726882209.68965: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882209.68991: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882209.69008: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882209.69021: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882209.69031: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882209.69058: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882209.69102: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882209.69106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882209.69157: Set connection var ansible_connection to ssh 11728 1726882209.69166: Set connection var ansible_shell_executable to /bin/sh 11728 1726882209.69199: Set connection var ansible_timeout to 10 11728 1726882209.69205: Set connection var ansible_shell_type to sh 11728 1726882209.69208: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882209.69210: Set connection var ansible_pipelining to False 11728 1726882209.69230: variable 'ansible_shell_executable' from source: unknown 11728 1726882209.69233: variable 'ansible_connection' from source: unknown 11728 1726882209.69235: variable 'ansible_module_compression' from source: unknown 11728 1726882209.69238: variable 'ansible_shell_type' from source: unknown 11728 1726882209.69240: variable 'ansible_shell_executable' from source: unknown 11728 1726882209.69242: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882209.69244: variable 'ansible_pipelining' from source: unknown 11728 1726882209.69246: variable 'ansible_timeout' from source: unknown 11728 1726882209.69248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882209.69379: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 11728 1726882209.69418: variable 'omit' from source: magic vars 11728 1726882209.69421: starting attempt loop 11728 1726882209.69424: running the handler 11728 1726882209.69463: handler run complete 11728 1726882209.69512: attempt loop complete, returning result 11728 1726882209.69517: _execute() done 11728 1726882209.69531: dumping result to json 11728 1726882209.69534: done dumping result, returning 11728 1726882209.69536: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-5c28-a762-000000000694] 11728 1726882209.69539: sending task result for task 12673a56-9f93-5c28-a762-000000000694 11728 1726882209.69609: done sending task result for task 12673a56-9f93-5c28-a762-000000000694 11728 1726882209.69612: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 11728 1726882209.69683: no more pending results, returning what we have 11728 1726882209.69687: results queue empty 11728 1726882209.69688: checking for any_errors_fatal 11728 1726882209.69698: done checking for any_errors_fatal 11728 1726882209.69699: checking for max_fail_percentage 11728 1726882209.69700: done checking for max_fail_percentage 11728 1726882209.69701: checking to see if all hosts have failed and the running result is not ok 11728 1726882209.69702: done checking to see if all hosts have failed 11728 1726882209.69703: getting the remaining hosts for this loop 11728 1726882209.69704: done getting the remaining hosts for this loop 11728 1726882209.69707: getting the next task for host managed_node3 11728 1726882209.69714: done getting next task for host managed_node3 11728 1726882209.69720: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11728 1726882209.69725: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882209.69738: getting variables 11728 1726882209.69741: in VariableManager get_vars() 11728 1726882209.69770: Calling all_inventory to load vars for managed_node3 11728 1726882209.69773: Calling groups_inventory to load vars for managed_node3 11728 1726882209.69775: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882209.69784: Calling all_plugins_play to load vars for managed_node3 11728 1726882209.69787: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882209.69790: Calling groups_plugins_play to load vars for managed_node3 11728 1726882209.70836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882209.71869: done with get_vars() 11728 1726882209.71886: done getting variables 11728 1726882209.71935: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:30:09 -0400 (0:00:00.041) 0:00:34.572 ****** 11728 1726882209.71976: entering _queue_task() for managed_node3/fail 11728 1726882209.72248: worker is 1 (out of 1 available) 11728 1726882209.72262: exiting _queue_task() for managed_node3/fail 11728 1726882209.72274: done queuing things up, now waiting for results queue to drain 11728 1726882209.72275: waiting for pending results... 
11728 1726882209.72506: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11728 1726882209.72603: in run() - task 12673a56-9f93-5c28-a762-000000000695 11728 1726882209.72615: variable 'ansible_search_path' from source: unknown 11728 1726882209.72621: variable 'ansible_search_path' from source: unknown 11728 1726882209.72650: calling self._execute() 11728 1726882209.72718: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882209.72726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882209.72798: variable 'omit' from source: magic vars 11728 1726882209.73083: variable 'ansible_distribution_major_version' from source: facts 11728 1726882209.73098: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882209.73187: variable 'network_state' from source: role '' defaults 11728 1726882209.73208: Evaluated conditional (network_state != {}): False 11728 1726882209.73212: when evaluation is False, skipping this task 11728 1726882209.73215: _execute() done 11728 1726882209.73218: dumping result to json 11728 1726882209.73221: done dumping result, returning 11728 1726882209.73228: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-5c28-a762-000000000695] 11728 1726882209.73233: sending task result for task 12673a56-9f93-5c28-a762-000000000695 11728 1726882209.73322: done sending task result for task 12673a56-9f93-5c28-a762-000000000695 11728 1726882209.73325: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11728 1726882209.73377: no more pending results, returning what we have 11728 1726882209.73381: results queue empty 11728 1726882209.73382: checking for any_errors_fatal 11728 1726882209.73389: done checking for any_errors_fatal 11728 1726882209.73390: checking for max_fail_percentage 11728 1726882209.73392: done checking for max_fail_percentage 11728 1726882209.73392: checking to see if all hosts have failed and the running result is not ok 11728 1726882209.73395: done checking to see if all hosts have failed 11728 1726882209.73396: getting the remaining hosts for this loop 11728 1726882209.73397: done getting the remaining hosts for this loop 11728 1726882209.73400: getting the next task for host managed_node3 11728 1726882209.73408: done getting next task for host managed_node3 11728 1726882209.73412: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11728 1726882209.73417: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882209.73435: getting variables 11728 1726882209.73436: in VariableManager get_vars() 11728 1726882209.73470: Calling all_inventory to load vars for managed_node3 11728 1726882209.73472: Calling groups_inventory to load vars for managed_node3 11728 1726882209.73474: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882209.73483: Calling all_plugins_play to load vars for managed_node3 11728 1726882209.73485: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882209.73487: Calling groups_plugins_play to load vars for managed_node3 11728 1726882209.74395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882209.75333: done with get_vars() 11728 1726882209.75350: done getting variables 11728 1726882209.75397: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:30:09 -0400 (0:00:00.034) 0:00:34.606 ****** 11728 1726882209.75423: entering _queue_task() for managed_node3/fail 11728 1726882209.75682: worker is 1 (out of 1 available) 11728 1726882209.75699: exiting _queue_task() for managed_node3/fail 11728 1726882209.75712: done queuing things up, now waiting for results queue to drain 11728 1726882209.75714: waiting for pending results... 
11728 1726882209.75907: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11728 1726882209.75992: in run() - task 12673a56-9f93-5c28-a762-000000000696 11728 1726882209.76007: variable 'ansible_search_path' from source: unknown 11728 1726882209.76010: variable 'ansible_search_path' from source: unknown 11728 1726882209.76044: calling self._execute() 11728 1726882209.76117: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882209.76121: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882209.76129: variable 'omit' from source: magic vars 11728 1726882209.76433: variable 'ansible_distribution_major_version' from source: facts 11728 1726882209.76442: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882209.76546: variable 'network_state' from source: role '' defaults 11728 1726882209.76549: Evaluated conditional (network_state != {}): False 11728 1726882209.76552: when evaluation is False, skipping this task 11728 1726882209.76556: _execute() done 11728 1726882209.76558: dumping result to json 11728 1726882209.76561: done dumping result, returning 11728 1726882209.76568: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-5c28-a762-000000000696] 11728 1726882209.76581: sending task result for task 12673a56-9f93-5c28-a762-000000000696 11728 1726882209.76676: done sending task result for task 12673a56-9f93-5c28-a762-000000000696 11728 1726882209.76679: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11728 1726882209.76760: no more pending results, returning what we have 11728 1726882209.76763: results queue empty 11728 1726882209.76765: checking for any_errors_fatal 11728 1726882209.76770: done checking for any_errors_fatal 11728 1726882209.76771: checking for max_fail_percentage 11728 1726882209.76772: done checking for max_fail_percentage 11728 1726882209.76773: checking to see if all hosts have failed and the running result is not ok 11728 1726882209.76774: done checking to see if all hosts have failed 11728 1726882209.76775: getting the remaining hosts for this loop 11728 1726882209.76776: done getting the remaining hosts for this loop 11728 1726882209.76779: getting the next task for host managed_node3 11728 1726882209.76786: done getting next task for host managed_node3 11728 1726882209.76789: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11728 1726882209.76796: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882209.76812: getting variables 11728 1726882209.76813: in VariableManager get_vars() 11728 1726882209.76847: Calling all_inventory to load vars for managed_node3 11728 1726882209.76850: Calling groups_inventory to load vars for managed_node3 11728 1726882209.76852: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882209.76860: Calling all_plugins_play to load vars for managed_node3 11728 1726882209.76863: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882209.76865: Calling groups_plugins_play to load vars for managed_node3 11728 1726882209.77798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882209.78777: done with get_vars() 11728 1726882209.78802: done getting variables 11728 1726882209.78857: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:30:09 -0400 (0:00:00.034) 0:00:34.641 ****** 11728 1726882209.78881: entering _queue_task() for managed_node3/fail 11728 1726882209.79116: worker is 1 (out of 1 available) 11728 1726882209.79130: exiting _queue_task() for managed_node3/fail 11728 1726882209.79142: done queuing things up, now waiting for results queue to drain 11728 1726882209.79143: waiting for pending results... 
11728 1726882209.79330: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11728 1726882209.79481: in run() - task 12673a56-9f93-5c28-a762-000000000697 11728 1726882209.79485: variable 'ansible_search_path' from source: unknown 11728 1726882209.79488: variable 'ansible_search_path' from source: unknown 11728 1726882209.79505: calling self._execute() 11728 1726882209.79568: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882209.79572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882209.79582: variable 'omit' from source: magic vars 11728 1726882209.79851: variable 'ansible_distribution_major_version' from source: facts 11728 1726882209.79860: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882209.79991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882209.81749: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882209.81801: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882209.81836: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882209.81916: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882209.81919: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882209.81969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882209.82002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882209.82025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882209.82058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882209.82100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882209.82160: variable 'ansible_distribution_major_version' from source: facts 11728 1726882209.82173: Evaluated conditional (ansible_distribution_major_version | int > 9): True 11728 1726882209.82253: variable 'ansible_distribution' from source: facts 11728 1726882209.82257: variable '__network_rh_distros' from source: role '' defaults 11728 1726882209.82265: Evaluated conditional (ansible_distribution in __network_rh_distros): True 11728 1726882209.82426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882209.82444: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882209.82464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882209.82488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882209.82504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882209.82536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882209.82551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882209.82572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882209.82597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882209.82610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882209.82654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882209.82672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882209.82691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882209.82753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882209.82757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882209.83013: variable 'network_connections' from source: task vars 11728 1726882209.83016: variable 'port2_profile' from source: play vars 11728 1726882209.83055: variable 'port2_profile' from source: play vars 11728 1726882209.83103: variable 'port1_profile' from source: play vars 11728 1726882209.83131: variable 'port1_profile' from source: play vars 11728 1726882209.83138: variable 'controller_profile' from source: play vars 
11728 1726882209.83180: variable 'controller_profile' from source: play vars 11728 1726882209.83187: variable 'network_state' from source: role '' defaults 11728 1726882209.83246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882209.83390: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882209.83421: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882209.83449: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882209.83466: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882209.83515: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882209.83528: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882209.83548: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882209.83567: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882209.83589: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 11728 1726882209.83592: when evaluation is False, skipping this task 11728 1726882209.83599: _execute() done 11728 1726882209.83601: dumping result to json 11728 1726882209.83603: done dumping result, returning 11728 1726882209.83630: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-5c28-a762-000000000697] 11728 1726882209.83633: sending task result for task 12673a56-9f93-5c28-a762-000000000697 11728 1726882209.83721: done sending task result for task 12673a56-9f93-5c28-a762-000000000697 11728 1726882209.83724: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 11728 1726882209.83770: no more pending results, returning what we have 11728 1726882209.83774: results queue empty 11728 1726882209.83775: checking for any_errors_fatal 11728 1726882209.83785: done checking for any_errors_fatal 11728 1726882209.83786: checking for max_fail_percentage 11728 1726882209.83788: done checking for max_fail_percentage 11728 1726882209.83788: checking to see if all hosts have failed and the running result is not ok 11728 1726882209.83789: done checking to see if all hosts have failed 11728 
1726882209.83789: getting the remaining hosts for this loop 11728 1726882209.83791: done getting the remaining hosts for this loop 11728 1726882209.83796: getting the next task for host managed_node3 11728 1726882209.83804: done getting next task for host managed_node3 11728 1726882209.83807: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11728 1726882209.83813: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882209.83829: getting variables 11728 1726882209.83830: in VariableManager get_vars() 11728 1726882209.83865: Calling all_inventory to load vars for managed_node3 11728 1726882209.83868: Calling groups_inventory to load vars for managed_node3 11728 1726882209.83870: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882209.83879: Calling all_plugins_play to load vars for managed_node3 11728 1726882209.83881: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882209.83884: Calling groups_plugins_play to load vars for managed_node3 11728 1726882209.84794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882209.85780: done with get_vars() 11728 1726882209.85799: done getting variables 11728 1726882209.85852: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:30:09 -0400 (0:00:00.070) 0:00:34.711 ****** 11728 1726882209.85891: entering _queue_task() for managed_node3/dnf 11728 1726882209.86160: worker is 1 (out of 1 available) 11728 1726882209.86174: exiting _queue_task() for managed_node3/dnf 11728 1726882209.86187: done queuing things up, now waiting for results queue to drain 11728 1726882209.86188: waiting for pending results... 
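The skip result above for the teaming guard (roles/network/tasks/main.yml:25) reports the exact condition that evaluated to False: no profile in network_connections and no entry in network_state.interfaces has type "team". A hedged sketch of that guard follows, using only the condition shown in false_condition; the fail message is an assumption, and the role's real task also combines the EL10 distribution checks evaluated just before this condition.

    # Sketch of the teaming guard; the when expression is copied from the
    # false_condition in the skip result above, the message is illustrative.
    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: "Team interfaces are not supported on this host"
      when: >-
        network_connections | selectattr("type", "defined")
        | selectattr("type", "match", "^team$") | list | length > 0
        or network_state.get("interfaces", []) | selectattr("type", "defined")
        | selectattr("type", "match", "^team$") | list | length > 0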
11728 1726882209.86384: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11728 1726882209.86483: in run() - task 12673a56-9f93-5c28-a762-000000000698 11728 1726882209.86496: variable 'ansible_search_path' from source: unknown 11728 1726882209.86500: variable 'ansible_search_path' from source: unknown 11728 1726882209.86533: calling self._execute() 11728 1726882209.86600: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882209.86606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882209.86616: variable 'omit' from source: magic vars 11728 1726882209.87078: variable 'ansible_distribution_major_version' from source: facts 11728 1726882209.87240: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882209.87323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882209.89132: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882209.89184: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882209.89217: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882209.89248: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882209.89271: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882209.89501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882209.89505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882209.89508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882209.89511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882209.89513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882209.89585: variable 'ansible_distribution' from source: facts 11728 1726882209.89601: variable 'ansible_distribution_major_version' from source: facts 11728 1726882209.89627: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 11728 1726882209.89746: variable '__network_wireless_connections_defined' from source: role '' defaults 11728 1726882209.89876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882209.89920: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882209.89946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882209.89985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882209.90007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882209.90051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882209.90501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882209.90505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882209.90507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882209.90508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882209.90510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882209.90512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882209.90515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882209.90530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882209.90621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882209.90775: variable 'network_connections' from source: task vars 11728 1726882209.90792: variable 'port2_profile' from source: play vars 11728 1726882209.90862: variable 'port2_profile' from source: play vars 11728 1726882209.90879: variable 'port1_profile' from source: play vars 11728 1726882209.90943: variable 'port1_profile' from source: play vars 11728 1726882209.90957: variable 'controller_profile' from source: play vars 
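The repeated lookups above (network_connections from task vars; port2_profile, port1_profile and controller_profile from play vars) suggest a test play whose connection list is templated from one controller profile and two port profiles. A minimal sketch of play vars with that shape follows; only the variable names come from the log, every value and key below is an assumption.

controller_profile: bond0            # assumed value
port1_profile: bond0.0               # assumed value
port2_profile: bond0.1               # assumed value

network_connections:
  - name: "{{ port2_profile }}"
    type: ethernet                   # assumed
    controller: "{{ controller_profile }}"
    port_type: bond                  # assumed
  - name: "{{ port1_profile }}"
    type: ethernet                   # assumed
    controller: "{{ controller_profile }}"
    port_type: bond                  # assumed
  - name: "{{ controller_profile }}"
    type: bond                       # assumed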
11728 1726882209.91021: variable 'controller_profile' from source: play vars 11728 1726882209.91092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882209.91281: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882209.91315: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882209.91342: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882209.91363: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882209.91396: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882209.91422: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882209.91439: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882209.91457: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882209.91497: variable '__network_team_connections_defined' from source: role '' defaults 11728 1726882209.91673: variable 'network_connections' from source: task vars 11728 1726882209.91677: variable 'port2_profile' from source: play vars 11728 1726882209.91723: variable 'port2_profile' from source: play vars 11728 1726882209.91730: variable 'port1_profile' from source: play vars 11728 1726882209.91774: variable 'port1_profile' from source: play vars 11728 1726882209.91781: variable 'controller_profile' from source: play vars 11728 1726882209.91825: variable 'controller_profile' from source: play vars 11728 1726882209.91844: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11728 1726882209.91847: when evaluation is False, skipping this task 11728 1726882209.91850: _execute() done 11728 1726882209.91854: dumping result to json 11728 1726882209.91857: done dumping result, returning 11728 1726882209.91867: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-5c28-a762-000000000698] 11728 1726882209.91869: sending task result for task 12673a56-9f93-5c28-a762-000000000698 11728 1726882209.91960: done sending task result for task 12673a56-9f93-5c28-a762-000000000698 11728 1726882209.91963: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11728 1726882209.92018: no more pending results, returning what we have 11728 1726882209.92023: results queue empty 11728 1726882209.92024: checking for any_errors_fatal 11728 1726882209.92031: done checking for any_errors_fatal 11728 1726882209.92032: checking for max_fail_percentage 11728 1726882209.92033: done checking 
for max_fail_percentage 11728 1726882209.92034: checking to see if all hosts have failed and the running result is not ok 11728 1726882209.92035: done checking to see if all hosts have failed 11728 1726882209.92035: getting the remaining hosts for this loop 11728 1726882209.92037: done getting the remaining hosts for this loop 11728 1726882209.92040: getting the next task for host managed_node3 11728 1726882209.92047: done getting next task for host managed_node3 11728 1726882209.92051: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11728 1726882209.92056: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882209.92075: getting variables 11728 1726882209.92077: in VariableManager get_vars() 11728 1726882209.92113: Calling all_inventory to load vars for managed_node3 11728 1726882209.92116: Calling groups_inventory to load vars for managed_node3 11728 1726882209.92118: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882209.92127: Calling all_plugins_play to load vars for managed_node3 11728 1726882209.92130: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882209.92133: Calling groups_plugins_play to load vars for managed_node3 11728 1726882209.92952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882209.93922: done with get_vars() 11728 1726882209.93944: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11728 1726882209.94021: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:30:09 -0400 (0:00:00.081) 0:00:34.792 ****** 11728 1726882209.94054: entering _queue_task() for managed_node3/yum 11728 1726882209.94375: worker is 1 (out of 1 available) 11728 1726882209.94387: exiting _queue_task() for managed_node3/yum 11728 1726882209.94401: done queuing things up, now waiting for results queue to drain 11728 1726882209.94402: waiting for pending results... 
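The "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" entry above is expected on dnf-based systems: the yum module name is an alias that resolves to the dnf action plugin. A YUM-flavoured counterpart of the previous check, guarded so it only runs on older yum-based distributions, might look roughly like the sketch below; the version guard matches the false_condition reported in the next skip result, everything else is assumed.

- name: >-
    Check if updates for network packages are available through the YUM
    package manager due to wireless or team interfaces
  ansible.builtin.yum:               # alias resolved to the dnf action plugin here
    name: "{{ network_packages }}"
    state: latest
  check_mode: true
  when: ansible_distribution_major_version | int < 8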
11728 1726882209.94815: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11728 1726882209.94853: in run() - task 12673a56-9f93-5c28-a762-000000000699 11728 1726882209.94871: variable 'ansible_search_path' from source: unknown 11728 1726882209.94877: variable 'ansible_search_path' from source: unknown 11728 1726882209.94917: calling self._execute() 11728 1726882209.95012: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882209.95022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882209.95033: variable 'omit' from source: magic vars 11728 1726882209.95409: variable 'ansible_distribution_major_version' from source: facts 11728 1726882209.95427: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882209.95612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882209.97886: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882209.97969: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882209.98088: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882209.98091: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882209.98098: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882209.98169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882209.98212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882209.98243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882209.98286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882209.98315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882209.98418: variable 'ansible_distribution_major_version' from source: facts 11728 1726882209.98441: Evaluated conditional (ansible_distribution_major_version | int < 8): False 11728 1726882209.98451: when evaluation is False, skipping this task 11728 1726882209.98457: _execute() done 11728 1726882209.98465: dumping result to json 11728 1726882209.98472: done dumping result, returning 11728 1726882209.98500: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-5c28-a762-000000000699] 11728 
1726882209.98504: sending task result for task 12673a56-9f93-5c28-a762-000000000699 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 11728 1726882209.98747: no more pending results, returning what we have 11728 1726882209.98752: results queue empty 11728 1726882209.98753: checking for any_errors_fatal 11728 1726882209.98761: done checking for any_errors_fatal 11728 1726882209.98761: checking for max_fail_percentage 11728 1726882209.98763: done checking for max_fail_percentage 11728 1726882209.98764: checking to see if all hosts have failed and the running result is not ok 11728 1726882209.98765: done checking to see if all hosts have failed 11728 1726882209.98766: getting the remaining hosts for this loop 11728 1726882209.98768: done getting the remaining hosts for this loop 11728 1726882209.98772: getting the next task for host managed_node3 11728 1726882209.98779: done getting next task for host managed_node3 11728 1726882209.98783: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11728 1726882209.98788: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882209.98811: getting variables 11728 1726882209.98812: in VariableManager get_vars() 11728 1726882209.98853: Calling all_inventory to load vars for managed_node3 11728 1726882209.98856: Calling groups_inventory to load vars for managed_node3 11728 1726882209.98858: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882209.98869: Calling all_plugins_play to load vars for managed_node3 11728 1726882209.98872: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882209.98875: Calling groups_plugins_play to load vars for managed_node3 11728 1726882210.00036: done sending task result for task 12673a56-9f93-5c28-a762-000000000699 11728 1726882210.00040: WORKER PROCESS EXITING 11728 1726882210.00113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882210.04780: done with get_vars() 11728 1726882210.04801: done getting variables 11728 1726882210.04840: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:30:10 -0400 (0:00:00.108) 0:00:34.901 ****** 11728 1726882210.04863: entering _queue_task() for managed_node3/fail 11728 1726882210.05127: worker is 1 (out of 1 available) 11728 1726882210.05141: exiting _queue_task() for managed_node3/fail 11728 1726882210.05154: done queuing things up, now waiting for results queue to drain 11728 1726882210.05155: waiting for pending results... 
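The consent task queued above is driven by the 'fail' action plugin, so it can only abort the run with a message rather than prompt interactively. A hedged sketch of such a consent gate is shown below; the wireless/team guard mirrors the conditional evaluated for the sibling tasks, while the message text and the network_allow_restart variable are hypothetical and not taken from the role source.

- name: >-
    Ask user's consent to restart NetworkManager due to wireless or team
    interfaces
  ansible.builtin.fail:
    msg: >-
      NetworkManager must be restarted to pick up wireless or team
      interfaces; set the consent variable to allow this.
  when:
    - __network_wireless_connections_defined or __network_team_connections_defined
    - not network_allow_restart | d(false)   # hypothetical consent variable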
11728 1726882210.05343: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11728 1726882210.05448: in run() - task 12673a56-9f93-5c28-a762-00000000069a 11728 1726882210.05458: variable 'ansible_search_path' from source: unknown 11728 1726882210.05462: variable 'ansible_search_path' from source: unknown 11728 1726882210.05489: calling self._execute() 11728 1726882210.05578: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882210.05583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882210.05591: variable 'omit' from source: magic vars 11728 1726882210.06100: variable 'ansible_distribution_major_version' from source: facts 11728 1726882210.06104: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882210.06107: variable '__network_wireless_connections_defined' from source: role '' defaults 11728 1726882210.06292: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882210.08387: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882210.08575: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882210.08740: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882210.08888: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882210.09199: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882210.09203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882210.09399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882210.09403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882210.09406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882210.09409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882210.09411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882210.09414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882210.09416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882210.09635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882210.09656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882210.09999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882210.10003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882210.10005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882210.10008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882210.10010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882210.10284: variable 'network_connections' from source: task vars 11728 1726882210.10306: variable 'port2_profile' from source: play vars 11728 1726882210.10465: variable 'port2_profile' from source: play vars 11728 1726882210.10482: variable 'port1_profile' from source: play vars 11728 1726882210.10545: variable 'port1_profile' from source: play vars 11728 1726882210.10714: variable 'controller_profile' from source: play vars 11728 1726882210.10775: variable 'controller_profile' from source: play vars 11728 1726882210.10853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882210.11259: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882210.11338: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882210.11392: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882210.11456: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882210.11800: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882210.11804: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882210.11806: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882210.11833: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882210.11890: variable '__network_team_connections_defined' from source: role '' defaults 11728 1726882210.12337: variable 'network_connections' from source: task vars 11728 1726882210.12699: variable 'port2_profile' from source: play vars 11728 1726882210.12703: variable 'port2_profile' from source: play vars 11728 1726882210.12705: variable 'port1_profile' from source: play vars 11728 1726882210.12706: variable 'port1_profile' from source: play vars 11728 1726882210.12709: variable 'controller_profile' from source: play vars 11728 1726882210.12711: variable 'controller_profile' from source: play vars 11728 1726882210.12808: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11728 1726882210.12906: when evaluation is False, skipping this task 11728 1726882210.12913: _execute() done 11728 1726882210.12919: dumping result to json 11728 1726882210.12926: done dumping result, returning 11728 1726882210.12937: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-5c28-a762-00000000069a] 11728 1726882210.12946: sending task result for task 12673a56-9f93-5c28-a762-00000000069a skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11728 1726882210.13097: no more pending results, returning what we have 11728 1726882210.13102: results queue empty 11728 1726882210.13103: checking for any_errors_fatal 11728 1726882210.13111: done checking for any_errors_fatal 11728 1726882210.13112: checking for max_fail_percentage 11728 1726882210.13114: done checking for max_fail_percentage 11728 1726882210.13114: checking to see if all hosts have failed and the running result is not ok 11728 1726882210.13115: done checking to see if all hosts have failed 11728 1726882210.13116: getting the remaining hosts for this loop 11728 1726882210.13118: done getting the remaining hosts for this loop 11728 1726882210.13121: getting the next task for host managed_node3 11728 1726882210.13129: done getting next task for host managed_node3 11728 1726882210.13134: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 11728 1726882210.13141: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882210.13161: getting variables 11728 1726882210.13162: in VariableManager get_vars() 11728 1726882210.13422: Calling all_inventory to load vars for managed_node3 11728 1726882210.13426: Calling groups_inventory to load vars for managed_node3 11728 1726882210.13429: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882210.13436: done sending task result for task 12673a56-9f93-5c28-a762-00000000069a 11728 1726882210.13439: WORKER PROCESS EXITING 11728 1726882210.13449: Calling all_plugins_play to load vars for managed_node3 11728 1726882210.13453: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882210.13456: Calling groups_plugins_play to load vars for managed_node3 11728 1726882210.15929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882210.17721: done with get_vars() 11728 1726882210.17754: done getting variables 11728 1726882210.17829: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:30:10 -0400 (0:00:00.130) 0:00:35.031 ****** 11728 1726882210.17876: entering _queue_task() for managed_node3/package 11728 1726882210.18371: worker is 1 (out of 1 available) 11728 1726882210.18383: exiting _queue_task() for managed_node3/package 11728 1726882210.18406: done queuing things up, now waiting for results queue to drain 11728 1726882210.18407: waiting for pending results... 
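The install task queued above uses the generic 'package' action plugin. A minimal sketch of what such a task could look like follows; the when guard is copied from the false_condition reported in the skip result further down, while the module parameters are assumptions rather than the role's actual source.

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())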
11728 1726882210.18651: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 11728 1726882210.18846: in run() - task 12673a56-9f93-5c28-a762-00000000069b 11728 1726882210.18870: variable 'ansible_search_path' from source: unknown 11728 1726882210.18883: variable 'ansible_search_path' from source: unknown 11728 1726882210.18926: calling self._execute() 11728 1726882210.19037: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882210.19057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882210.19078: variable 'omit' from source: magic vars 11728 1726882210.19535: variable 'ansible_distribution_major_version' from source: facts 11728 1726882210.19553: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882210.19775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882210.20084: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882210.20136: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882210.20187: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882210.20274: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882210.20405: variable 'network_packages' from source: role '' defaults 11728 1726882210.20533: variable '__network_provider_setup' from source: role '' defaults 11728 1726882210.20589: variable '__network_service_name_default_nm' from source: role '' defaults 11728 1726882210.20699: variable '__network_service_name_default_nm' from source: role '' defaults 11728 1726882210.20703: variable '__network_packages_default_nm' from source: role '' defaults 11728 1726882210.20756: variable '__network_packages_default_nm' from source: role '' defaults 11728 1726882210.21024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882210.22748: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882210.22797: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882210.22823: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882210.22846: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882210.22871: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882210.22928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882210.22948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882210.22966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882210.23000: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882210.23009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882210.23043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882210.23058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882210.23075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882210.23106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882210.23117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882210.23258: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11728 1726882210.23348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882210.23363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882210.23379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882210.23407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882210.23419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882210.23479: variable 'ansible_python' from source: facts 11728 1726882210.23496: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11728 1726882210.23553: variable '__network_wpa_supplicant_required' from source: role '' defaults 11728 1726882210.23610: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11728 1726882210.23698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882210.23714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 11728 1726882210.23731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882210.23758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882210.23769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882210.23804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882210.23823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882210.23839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882210.23868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882210.23878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882210.23972: variable 'network_connections' from source: task vars 11728 1726882210.23975: variable 'port2_profile' from source: play vars 11728 1726882210.24048: variable 'port2_profile' from source: play vars 11728 1726882210.24057: variable 'port1_profile' from source: play vars 11728 1726882210.24175: variable 'port1_profile' from source: play vars 11728 1726882210.24179: variable 'controller_profile' from source: play vars 11728 1726882210.24303: variable 'controller_profile' from source: play vars 11728 1726882210.24503: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882210.24507: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882210.24510: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882210.24512: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882210.24515: variable '__network_wireless_connections_defined' from source: role '' defaults 11728 1726882210.24727: variable 'network_connections' from source: task vars 11728 1726882210.24730: variable 'port2_profile' from source: play vars 11728 1726882210.24835: variable 'port2_profile' from source: play vars 11728 
1726882210.24838: variable 'port1_profile' from source: play vars 11728 1726882210.24941: variable 'port1_profile' from source: play vars 11728 1726882210.24945: variable 'controller_profile' from source: play vars 11728 1726882210.25029: variable 'controller_profile' from source: play vars 11728 1726882210.25071: variable '__network_packages_default_wireless' from source: role '' defaults 11728 1726882210.25177: variable '__network_wireless_connections_defined' from source: role '' defaults 11728 1726882210.25409: variable 'network_connections' from source: task vars 11728 1726882210.25420: variable 'port2_profile' from source: play vars 11728 1726882210.25472: variable 'port2_profile' from source: play vars 11728 1726882210.25514: variable 'port1_profile' from source: play vars 11728 1726882210.25538: variable 'port1_profile' from source: play vars 11728 1726882210.25604: variable 'controller_profile' from source: play vars 11728 1726882210.25607: variable 'controller_profile' from source: play vars 11728 1726882210.25637: variable '__network_packages_default_team' from source: role '' defaults 11728 1726882210.25709: variable '__network_team_connections_defined' from source: role '' defaults 11728 1726882210.25987: variable 'network_connections' from source: task vars 11728 1726882210.25990: variable 'port2_profile' from source: play vars 11728 1726882210.26053: variable 'port2_profile' from source: play vars 11728 1726882210.26060: variable 'port1_profile' from source: play vars 11728 1726882210.26137: variable 'port1_profile' from source: play vars 11728 1726882210.26141: variable 'controller_profile' from source: play vars 11728 1726882210.26203: variable 'controller_profile' from source: play vars 11728 1726882210.26323: variable '__network_service_name_default_initscripts' from source: role '' defaults 11728 1726882210.26326: variable '__network_service_name_default_initscripts' from source: role '' defaults 11728 1726882210.26329: variable '__network_packages_default_initscripts' from source: role '' defaults 11728 1726882210.26367: variable '__network_packages_default_initscripts' from source: role '' defaults 11728 1726882210.26543: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11728 1726882210.26850: variable 'network_connections' from source: task vars 11728 1726882210.26853: variable 'port2_profile' from source: play vars 11728 1726882210.26899: variable 'port2_profile' from source: play vars 11728 1726882210.26903: variable 'port1_profile' from source: play vars 11728 1726882210.26944: variable 'port1_profile' from source: play vars 11728 1726882210.26949: variable 'controller_profile' from source: play vars 11728 1726882210.26996: variable 'controller_profile' from source: play vars 11728 1726882210.27001: variable 'ansible_distribution' from source: facts 11728 1726882210.27005: variable '__network_rh_distros' from source: role '' defaults 11728 1726882210.27010: variable 'ansible_distribution_major_version' from source: facts 11728 1726882210.27022: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11728 1726882210.27129: variable 'ansible_distribution' from source: facts 11728 1726882210.27132: variable '__network_rh_distros' from source: role '' defaults 11728 1726882210.27135: variable 'ansible_distribution_major_version' from source: facts 11728 1726882210.27147: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11728 1726882210.27255: 
variable 'ansible_distribution' from source: facts 11728 1726882210.27258: variable '__network_rh_distros' from source: role '' defaults 11728 1726882210.27262: variable 'ansible_distribution_major_version' from source: facts 11728 1726882210.27291: variable 'network_provider' from source: set_fact 11728 1726882210.27305: variable 'ansible_facts' from source: unknown 11728 1726882210.27669: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 11728 1726882210.27673: when evaluation is False, skipping this task 11728 1726882210.27675: _execute() done 11728 1726882210.27678: dumping result to json 11728 1726882210.27680: done dumping result, returning 11728 1726882210.27688: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-5c28-a762-00000000069b] 11728 1726882210.27692: sending task result for task 12673a56-9f93-5c28-a762-00000000069b 11728 1726882210.27791: done sending task result for task 12673a56-9f93-5c28-a762-00000000069b 11728 1726882210.27797: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 11728 1726882210.27847: no more pending results, returning what we have 11728 1726882210.27851: results queue empty 11728 1726882210.27852: checking for any_errors_fatal 11728 1726882210.27858: done checking for any_errors_fatal 11728 1726882210.27858: checking for max_fail_percentage 11728 1726882210.27860: done checking for max_fail_percentage 11728 1726882210.27861: checking to see if all hosts have failed and the running result is not ok 11728 1726882210.27862: done checking to see if all hosts have failed 11728 1726882210.27862: getting the remaining hosts for this loop 11728 1726882210.27864: done getting the remaining hosts for this loop 11728 1726882210.27872: getting the next task for host managed_node3 11728 1726882210.27878: done getting next task for host managed_node3 11728 1726882210.27882: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11728 1726882210.27886: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882210.27913: getting variables 11728 1726882210.27914: in VariableManager get_vars() 11728 1726882210.27952: Calling all_inventory to load vars for managed_node3 11728 1726882210.27954: Calling groups_inventory to load vars for managed_node3 11728 1726882210.27957: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882210.27966: Calling all_plugins_play to load vars for managed_node3 11728 1726882210.27968: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882210.27970: Calling groups_plugins_play to load vars for managed_node3 11728 1726882210.28935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882210.30245: done with get_vars() 11728 1726882210.30262: done getting variables 11728 1726882210.30308: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:30:10 -0400 (0:00:00.124) 0:00:35.155 ****** 11728 1726882210.30332: entering _queue_task() for managed_node3/package 11728 1726882210.30558: worker is 1 (out of 1 available) 11728 1726882210.30571: exiting _queue_task() for managed_node3/package 11728 1726882210.30583: done queuing things up, now waiting for results queue to drain 11728 1726882210.30584: waiting for pending results... 
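The preceding skip shows "Install packages" being bypassed because every entry of network_packages is already a key of ansible_facts.packages, so there is nothing to install. That fact is only populated after package_facts has been gathered earlier in the run. A small, self-contained illustration of the pattern (not the role's own tasks) is shown below.

- name: Gather the installed-package inventory that the subset test reads
  ansible.builtin.package_facts:
    manager: auto

- name: Show whether every managed package is already installed
  ansible.builtin.debug:
    msg: "all of network_packages already installed"
  when: network_packages is subset(ansible_facts.packages.keys())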
11728 1726882210.30774: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11728 1726882210.30887: in run() - task 12673a56-9f93-5c28-a762-00000000069c 11728 1726882210.30903: variable 'ansible_search_path' from source: unknown 11728 1726882210.30906: variable 'ansible_search_path' from source: unknown 11728 1726882210.30939: calling self._execute() 11728 1726882210.31006: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882210.31010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882210.31018: variable 'omit' from source: magic vars 11728 1726882210.31290: variable 'ansible_distribution_major_version' from source: facts 11728 1726882210.31304: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882210.31386: variable 'network_state' from source: role '' defaults 11728 1726882210.31396: Evaluated conditional (network_state != {}): False 11728 1726882210.31401: when evaluation is False, skipping this task 11728 1726882210.31404: _execute() done 11728 1726882210.31407: dumping result to json 11728 1726882210.31410: done dumping result, returning 11728 1726882210.31418: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-5c28-a762-00000000069c] 11728 1726882210.31422: sending task result for task 12673a56-9f93-5c28-a762-00000000069c 11728 1726882210.31516: done sending task result for task 12673a56-9f93-5c28-a762-00000000069c 11728 1726882210.31519: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11728 1726882210.31566: no more pending results, returning what we have 11728 1726882210.31570: results queue empty 11728 1726882210.31571: checking for any_errors_fatal 11728 1726882210.31578: done checking for any_errors_fatal 11728 1726882210.31579: checking for max_fail_percentage 11728 1726882210.31580: done checking for max_fail_percentage 11728 1726882210.31581: checking to see if all hosts have failed and the running result is not ok 11728 1726882210.31582: done checking to see if all hosts have failed 11728 1726882210.31583: getting the remaining hosts for this loop 11728 1726882210.31585: done getting the remaining hosts for this loop 11728 1726882210.31588: getting the next task for host managed_node3 11728 1726882210.31596: done getting next task for host managed_node3 11728 1726882210.31600: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11728 1726882210.31605: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882210.31621: getting variables 11728 1726882210.31622: in VariableManager get_vars() 11728 1726882210.31660: Calling all_inventory to load vars for managed_node3 11728 1726882210.31663: Calling groups_inventory to load vars for managed_node3 11728 1726882210.31665: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882210.31673: Calling all_plugins_play to load vars for managed_node3 11728 1726882210.31675: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882210.31677: Calling groups_plugins_play to load vars for managed_node3 11728 1726882210.32416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882210.33296: done with get_vars() 11728 1726882210.33310: done getting variables 11728 1726882210.33348: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:30:10 -0400 (0:00:00.030) 0:00:35.186 ****** 11728 1726882210.33375: entering _queue_task() for managed_node3/package 11728 1726882210.33575: worker is 1 (out of 1 available) 11728 1726882210.33589: exiting _queue_task() for managed_node3/package 11728 1726882210.33606: done queuing things up, now waiting for results queue to drain 11728 1726882210.33607: waiting for pending results... 
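Both nmstate-related install tasks in this stretch are gated on the same condition, network_state != {}, which is why they skip here (the play drives the role through network_connections instead). A hedged sketch of that pair follows; the task names and the when guard come from the log, the package list and state parameter are assumptions.

- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}

- name: Install python3-libnmstate when using network_state variable
  ansible.builtin.package:
    name: python3-libnmstate
    state: present
  when: network_state != {}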
11728 1726882210.33789: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11728 1726882210.33885: in run() - task 12673a56-9f93-5c28-a762-00000000069d 11728 1726882210.33899: variable 'ansible_search_path' from source: unknown 11728 1726882210.33903: variable 'ansible_search_path' from source: unknown 11728 1726882210.33929: calling self._execute() 11728 1726882210.34000: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882210.34004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882210.34012: variable 'omit' from source: magic vars 11728 1726882210.34284: variable 'ansible_distribution_major_version' from source: facts 11728 1726882210.34296: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882210.34374: variable 'network_state' from source: role '' defaults 11728 1726882210.34388: Evaluated conditional (network_state != {}): False 11728 1726882210.34391: when evaluation is False, skipping this task 11728 1726882210.34397: _execute() done 11728 1726882210.34400: dumping result to json 11728 1726882210.34402: done dumping result, returning 11728 1726882210.34405: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-5c28-a762-00000000069d] 11728 1726882210.34408: sending task result for task 12673a56-9f93-5c28-a762-00000000069d 11728 1726882210.34498: done sending task result for task 12673a56-9f93-5c28-a762-00000000069d 11728 1726882210.34501: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11728 1726882210.34546: no more pending results, returning what we have 11728 1726882210.34549: results queue empty 11728 1726882210.34553: checking for any_errors_fatal 11728 1726882210.34559: done checking for any_errors_fatal 11728 1726882210.34559: checking for max_fail_percentage 11728 1726882210.34561: done checking for max_fail_percentage 11728 1726882210.34562: checking to see if all hosts have failed and the running result is not ok 11728 1726882210.34563: done checking to see if all hosts have failed 11728 1726882210.34563: getting the remaining hosts for this loop 11728 1726882210.34565: done getting the remaining hosts for this loop 11728 1726882210.34568: getting the next task for host managed_node3 11728 1726882210.34575: done getting next task for host managed_node3 11728 1726882210.34578: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11728 1726882210.34583: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882210.34601: getting variables 11728 1726882210.34602: in VariableManager get_vars() 11728 1726882210.34630: Calling all_inventory to load vars for managed_node3 11728 1726882210.34633: Calling groups_inventory to load vars for managed_node3 11728 1726882210.34635: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882210.34642: Calling all_plugins_play to load vars for managed_node3 11728 1726882210.34644: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882210.34647: Calling groups_plugins_play to load vars for managed_node3 11728 1726882210.35851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882210.37010: done with get_vars() 11728 1726882210.37033: done getting variables 11728 1726882210.37107: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:30:10 -0400 (0:00:00.037) 0:00:35.223 ****** 11728 1726882210.37143: entering _queue_task() for managed_node3/service 11728 1726882210.37389: worker is 1 (out of 1 available) 11728 1726882210.37406: exiting _queue_task() for managed_node3/service 11728 1726882210.37419: done queuing things up, now waiting for results queue to drain 11728 1726882210.37421: waiting for pending results... 
11728 1726882210.37662: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11728 1726882210.37775: in run() - task 12673a56-9f93-5c28-a762-00000000069e 11728 1726882210.37790: variable 'ansible_search_path' from source: unknown 11728 1726882210.37800: variable 'ansible_search_path' from source: unknown 11728 1726882210.37832: calling self._execute() 11728 1726882210.37921: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882210.37925: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882210.37928: variable 'omit' from source: magic vars 11728 1726882210.38230: variable 'ansible_distribution_major_version' from source: facts 11728 1726882210.38260: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882210.38392: variable '__network_wireless_connections_defined' from source: role '' defaults 11728 1726882210.38530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882210.40223: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882210.40278: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882210.40309: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882210.40334: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882210.40355: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882210.40416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882210.40438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882210.40457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882210.40486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882210.40501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882210.40533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882210.40549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882210.40566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 11728 1726882210.40596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882210.40611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882210.40638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882210.40654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882210.40671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882210.40701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882210.40713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882210.40829: variable 'network_connections' from source: task vars 11728 1726882210.40838: variable 'port2_profile' from source: play vars 11728 1726882210.40885: variable 'port2_profile' from source: play vars 11728 1726882210.40895: variable 'port1_profile' from source: play vars 11728 1726882210.40943: variable 'port1_profile' from source: play vars 11728 1726882210.40950: variable 'controller_profile' from source: play vars 11728 1726882210.40991: variable 'controller_profile' from source: play vars 11728 1726882210.41045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882210.41167: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882210.41195: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882210.41224: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882210.41246: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882210.41277: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882210.41296: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882210.41316: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882210.41341: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882210.41378: variable '__network_team_connections_defined' from source: role '' defaults 11728 1726882210.41530: variable 'network_connections' from source: task vars 11728 1726882210.41534: variable 'port2_profile' from source: play vars 11728 1726882210.41580: variable 'port2_profile' from source: play vars 11728 1726882210.41586: variable 'port1_profile' from source: play vars 11728 1726882210.41632: variable 'port1_profile' from source: play vars 11728 1726882210.41639: variable 'controller_profile' from source: play vars 11728 1726882210.41684: variable 'controller_profile' from source: play vars 11728 1726882210.41707: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11728 1726882210.41718: when evaluation is False, skipping this task 11728 1726882210.41721: _execute() done 11728 1726882210.41724: dumping result to json 11728 1726882210.41727: done dumping result, returning 11728 1726882210.41729: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-5c28-a762-00000000069e] 11728 1726882210.41731: sending task result for task 12673a56-9f93-5c28-a762-00000000069e 11728 1726882210.41813: done sending task result for task 12673a56-9f93-5c28-a762-00000000069e 11728 1726882210.41815: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11728 1726882210.41856: no more pending results, returning what we have 11728 1726882210.41860: results queue empty 11728 1726882210.41860: checking for any_errors_fatal 11728 1726882210.41867: done checking for any_errors_fatal 11728 1726882210.41868: checking for max_fail_percentage 11728 1726882210.41870: done checking for max_fail_percentage 11728 1726882210.41870: checking to see if all hosts have failed and the running result is not ok 11728 1726882210.41871: done checking to see if all hosts have failed 11728 1726882210.41872: getting the remaining hosts for this loop 11728 1726882210.41874: done getting the remaining hosts for this loop 11728 1726882210.41880: getting the next task for host managed_node3 11728 1726882210.41886: done getting next task for host managed_node3 11728 1726882210.41890: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11728 1726882210.41896: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882210.41912: getting variables 11728 1726882210.41914: in VariableManager get_vars() 11728 1726882210.41951: Calling all_inventory to load vars for managed_node3 11728 1726882210.41953: Calling groups_inventory to load vars for managed_node3 11728 1726882210.41955: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882210.41964: Calling all_plugins_play to load vars for managed_node3 11728 1726882210.41967: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882210.41970: Calling groups_plugins_play to load vars for managed_node3 11728 1726882210.42803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882210.43875: done with get_vars() 11728 1726882210.43897: done getting variables 11728 1726882210.43936: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:30:10 -0400 (0:00:00.068) 0:00:35.292 ****** 11728 1726882210.43960: entering _queue_task() for managed_node3/service 11728 1726882210.44250: worker is 1 (out of 1 available) 11728 1726882210.44264: exiting _queue_task() for managed_node3/service 11728 1726882210.44277: done queuing things up, now waiting for results queue to drain 11728 1726882210.44278: waiting for pending results... 
11728 1726882210.44488: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11728 1726882210.44584: in run() - task 12673a56-9f93-5c28-a762-00000000069f 11728 1726882210.44599: variable 'ansible_search_path' from source: unknown 11728 1726882210.44602: variable 'ansible_search_path' from source: unknown 11728 1726882210.44629: calling self._execute() 11728 1726882210.44702: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882210.44706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882210.44718: variable 'omit' from source: magic vars 11728 1726882210.45117: variable 'ansible_distribution_major_version' from source: facts 11728 1726882210.45127: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882210.45275: variable 'network_provider' from source: set_fact 11728 1726882210.45278: variable 'network_state' from source: role '' defaults 11728 1726882210.45288: Evaluated conditional (network_provider == "nm" or network_state != {}): True 11728 1726882210.45298: variable 'omit' from source: magic vars 11728 1726882210.45358: variable 'omit' from source: magic vars 11728 1726882210.45387: variable 'network_service_name' from source: role '' defaults 11728 1726882210.45462: variable 'network_service_name' from source: role '' defaults 11728 1726882210.45560: variable '__network_provider_setup' from source: role '' defaults 11728 1726882210.45567: variable '__network_service_name_default_nm' from source: role '' defaults 11728 1726882210.45640: variable '__network_service_name_default_nm' from source: role '' defaults 11728 1726882210.45647: variable '__network_packages_default_nm' from source: role '' defaults 11728 1726882210.45707: variable '__network_packages_default_nm' from source: role '' defaults 11728 1726882210.45843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882210.47522: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882210.47576: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882210.47605: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882210.47631: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882210.47655: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882210.47725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882210.47747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882210.47768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882210.47797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 11728 1726882210.47807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882210.47839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882210.47860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882210.47877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882210.47904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882210.47915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882210.48062: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11728 1726882210.48140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882210.48156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882210.48172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882210.48203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882210.48215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882210.48274: variable 'ansible_python' from source: facts 11728 1726882210.48286: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11728 1726882210.48345: variable '__network_wpa_supplicant_required' from source: role '' defaults 11728 1726882210.48400: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11728 1726882210.48481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882210.48500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882210.48525: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882210.48561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882210.48572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882210.48607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882210.48628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882210.48647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882210.48672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882210.48682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882210.48775: variable 'network_connections' from source: task vars 11728 1726882210.48781: variable 'port2_profile' from source: play vars 11728 1726882210.48835: variable 'port2_profile' from source: play vars 11728 1726882210.48848: variable 'port1_profile' from source: play vars 11728 1726882210.48901: variable 'port1_profile' from source: play vars 11728 1726882210.48911: variable 'controller_profile' from source: play vars 11728 1726882210.48965: variable 'controller_profile' from source: play vars 11728 1726882210.49034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882210.49165: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882210.49205: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882210.49237: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882210.49266: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882210.49315: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882210.49336: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882210.49357: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, 
class_only=False) 11728 1726882210.49379: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882210.49438: variable '__network_wireless_connections_defined' from source: role '' defaults 11728 1726882210.49666: variable 'network_connections' from source: task vars 11728 1726882210.49689: variable 'port2_profile' from source: play vars 11728 1726882210.49755: variable 'port2_profile' from source: play vars 11728 1726882210.49766: variable 'port1_profile' from source: play vars 11728 1726882210.49839: variable 'port1_profile' from source: play vars 11728 1726882210.49853: variable 'controller_profile' from source: play vars 11728 1726882210.49929: variable 'controller_profile' from source: play vars 11728 1726882210.49963: variable '__network_packages_default_wireless' from source: role '' defaults 11728 1726882210.50023: variable '__network_wireless_connections_defined' from source: role '' defaults 11728 1726882210.50217: variable 'network_connections' from source: task vars 11728 1726882210.50220: variable 'port2_profile' from source: play vars 11728 1726882210.50268: variable 'port2_profile' from source: play vars 11728 1726882210.50275: variable 'port1_profile' from source: play vars 11728 1726882210.50357: variable 'port1_profile' from source: play vars 11728 1726882210.50360: variable 'controller_profile' from source: play vars 11728 1726882210.50446: variable 'controller_profile' from source: play vars 11728 1726882210.50463: variable '__network_packages_default_team' from source: role '' defaults 11728 1726882210.50526: variable '__network_team_connections_defined' from source: role '' defaults 11728 1726882210.50708: variable 'network_connections' from source: task vars 11728 1726882210.50711: variable 'port2_profile' from source: play vars 11728 1726882210.50763: variable 'port2_profile' from source: play vars 11728 1726882210.50769: variable 'port1_profile' from source: play vars 11728 1726882210.50819: variable 'port1_profile' from source: play vars 11728 1726882210.50826: variable 'controller_profile' from source: play vars 11728 1726882210.50877: variable 'controller_profile' from source: play vars 11728 1726882210.50916: variable '__network_service_name_default_initscripts' from source: role '' defaults 11728 1726882210.50960: variable '__network_service_name_default_initscripts' from source: role '' defaults 11728 1726882210.50965: variable '__network_packages_default_initscripts' from source: role '' defaults 11728 1726882210.51010: variable '__network_packages_default_initscripts' from source: role '' defaults 11728 1726882210.51162: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11728 1726882210.51482: variable 'network_connections' from source: task vars 11728 1726882210.51486: variable 'port2_profile' from source: play vars 11728 1726882210.51532: variable 'port2_profile' from source: play vars 11728 1726882210.51539: variable 'port1_profile' from source: play vars 11728 1726882210.51578: variable 'port1_profile' from source: play vars 11728 1726882210.51584: variable 'controller_profile' from source: play vars 11728 1726882210.51633: variable 'controller_profile' from source: play vars 11728 1726882210.51640: variable 'ansible_distribution' from source: facts 11728 1726882210.51642: variable '__network_rh_distros' from source: role '' defaults 11728 1726882210.51648: variable 
'ansible_distribution_major_version' from source: facts 11728 1726882210.51659: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11728 1726882210.51775: variable 'ansible_distribution' from source: facts 11728 1726882210.51779: variable '__network_rh_distros' from source: role '' defaults 11728 1726882210.51782: variable 'ansible_distribution_major_version' from source: facts 11728 1726882210.51795: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11728 1726882210.51930: variable 'ansible_distribution' from source: facts 11728 1726882210.51933: variable '__network_rh_distros' from source: role '' defaults 11728 1726882210.51936: variable 'ansible_distribution_major_version' from source: facts 11728 1726882210.51961: variable 'network_provider' from source: set_fact 11728 1726882210.51996: variable 'omit' from source: magic vars 11728 1726882210.52017: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882210.52042: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882210.52059: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882210.52073: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882210.52106: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882210.52131: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882210.52135: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882210.52139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882210.52255: Set connection var ansible_connection to ssh 11728 1726882210.52258: Set connection var ansible_shell_executable to /bin/sh 11728 1726882210.52260: Set connection var ansible_timeout to 10 11728 1726882210.52264: Set connection var ansible_shell_type to sh 11728 1726882210.52275: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882210.52277: Set connection var ansible_pipelining to False 11728 1726882210.52298: variable 'ansible_shell_executable' from source: unknown 11728 1726882210.52301: variable 'ansible_connection' from source: unknown 11728 1726882210.52304: variable 'ansible_module_compression' from source: unknown 11728 1726882210.52306: variable 'ansible_shell_type' from source: unknown 11728 1726882210.52308: variable 'ansible_shell_executable' from source: unknown 11728 1726882210.52310: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882210.52312: variable 'ansible_pipelining' from source: unknown 11728 1726882210.52314: variable 'ansible_timeout' from source: unknown 11728 1726882210.52319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882210.52410: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882210.52419: variable 'omit' from source: magic vars 11728 1726882210.52425: starting attempt loop 11728 1726882210.52428: running the 
handler 11728 1726882210.52499: variable 'ansible_facts' from source: unknown 11728 1726882210.53061: _low_level_execute_command(): starting 11728 1726882210.53067: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882210.53614: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882210.53644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882210.53682: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882210.53685: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882210.53687: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882210.53755: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882210.55425: stdout chunk (state=3): >>>/root <<< 11728 1726882210.55556: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882210.55559: stderr chunk (state=3): >>><<< 11728 1726882210.55564: stdout chunk (state=3): >>><<< 11728 1726882210.55586: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882210.55604: _low_level_execute_command(): starting 11728 1726882210.55618: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882210.5558958-13495-97871196092336 `" && echo 
ansible-tmp-1726882210.5558958-13495-97871196092336="` echo /root/.ansible/tmp/ansible-tmp-1726882210.5558958-13495-97871196092336 `" ) && sleep 0' 11728 1726882210.56132: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882210.56135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882210.56139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882210.56142: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 11728 1726882210.56145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882210.56209: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882210.56213: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882210.56215: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882210.56258: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882210.58149: stdout chunk (state=3): >>>ansible-tmp-1726882210.5558958-13495-97871196092336=/root/.ansible/tmp/ansible-tmp-1726882210.5558958-13495-97871196092336 <<< 11728 1726882210.58235: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882210.58259: stderr chunk (state=3): >>><<< 11728 1726882210.58262: stdout chunk (state=3): >>><<< 11728 1726882210.58275: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882210.5558958-13495-97871196092336=/root/.ansible/tmp/ansible-tmp-1726882210.5558958-13495-97871196092336 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 
1726882210.58304: variable 'ansible_module_compression' from source: unknown 11728 1726882210.58343: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 11728 1726882210.58406: variable 'ansible_facts' from source: unknown 11728 1726882210.58704: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882210.5558958-13495-97871196092336/AnsiballZ_systemd.py 11728 1726882210.59028: Sending initial data 11728 1726882210.59032: Sent initial data (155 bytes) 11728 1726882210.59710: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882210.59718: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882210.59729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882210.59749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882210.59775: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882210.59783: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882210.59802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882210.59848: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882210.59906: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882210.59912: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882210.59928: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882210.59995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882210.61526: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882210.61571: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882210.61637: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmp57b8y_ed /root/.ansible/tmp/ansible-tmp-1726882210.5558958-13495-97871196092336/AnsiballZ_systemd.py <<< 11728 1726882210.61641: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882210.5558958-13495-97871196092336/AnsiballZ_systemd.py" <<< 11728 1726882210.61757: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmp57b8y_ed" to remote "/root/.ansible/tmp/ansible-tmp-1726882210.5558958-13495-97871196092336/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882210.5558958-13495-97871196092336/AnsiballZ_systemd.py" <<< 11728 1726882210.63501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882210.63505: stdout chunk (state=3): >>><<< 11728 1726882210.63508: stderr chunk (state=3): >>><<< 11728 1726882210.63510: done transferring module to remote 11728 1726882210.63512: _low_level_execute_command(): starting 11728 1726882210.63514: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882210.5558958-13495-97871196092336/ /root/.ansible/tmp/ansible-tmp-1726882210.5558958-13495-97871196092336/AnsiballZ_systemd.py && sleep 0' 11728 1726882210.64342: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882210.64395: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882210.64409: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882210.64440: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882210.64613: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882210.66341: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882210.66345: stdout chunk (state=3): >>><<< 11728 1726882210.66348: stderr chunk (state=3): >>><<< 11728 1726882210.66486: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882210.66575: _low_level_execute_command(): starting 11728 1726882210.66579: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882210.5558958-13495-97871196092336/AnsiballZ_systemd.py && sleep 0' 11728 1726882210.67110: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882210.67135: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882210.67151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882210.67175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882210.67208: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882210.67286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882210.67412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882210.67863: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882210.67909: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882210.96629: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "711", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": 
"success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainStartTimestampMonotonic": "33869352", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainHandoffTimestampMonotonic": "33887880", "ExecMainPID": "711", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10412032", "MemoryPeak": "13234176", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3311915008", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "667949000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 11728 1726882210.96689: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": 
"infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", 
"RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target NetworkManager-wait-online.service shutdown.target cloud-init.service multi-user.target", "After": "system.slice network-pre.target dbus.socket systemd-journald.socket sysinit.target cloud-init-local.service dbus-broker.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:09 EDT", "StateChangeTimestampMonotonic": "456380513", "InactiveExitTimestamp": "Fri 2024-09-20 21:22:06 EDT", "InactiveExitTimestampMonotonic": "33869684", "ActiveEnterTimestamp": "Fri 2024-09-20 21:22:07 EDT", "ActiveEnterTimestampMonotonic": "34618487", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ConditionTimestampMonotonic": "33868497", "AssertTimestamp": "Fri 2024-09-20 21:22:06 EDT", "AssertTimestampMonotonic": "33868500", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4af90bef47894571b134cea51cfa03e2", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11728 1726882210.98510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 11728 1726882210.98522: stdout chunk (state=3): >>><<< 11728 1726882210.98524: stderr chunk (state=3): >>><<< 11728 1726882210.98631: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "711", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainStartTimestampMonotonic": "33869352", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainHandoffTimestampMonotonic": "33887880", "ExecMainPID": "711", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10412032", "MemoryPeak": "13234176", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3311915008", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "667949000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target NetworkManager-wait-online.service shutdown.target cloud-init.service multi-user.target", "After": "system.slice network-pre.target dbus.socket systemd-journald.socket sysinit.target cloud-init-local.service dbus-broker.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:09 EDT", "StateChangeTimestampMonotonic": "456380513", "InactiveExitTimestamp": "Fri 2024-09-20 21:22:06 EDT", "InactiveExitTimestampMonotonic": "33869684", "ActiveEnterTimestamp": "Fri 2024-09-20 21:22:07 EDT", "ActiveEnterTimestampMonotonic": "34618487", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ConditionTimestampMonotonic": "33868497", "AssertTimestamp": "Fri 2024-09-20 21:22:06 EDT", "AssertTimestampMonotonic": "33868500", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4af90bef47894571b134cea51cfa03e2", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 11728 1726882210.98773: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882210.5558958-13495-97871196092336/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882210.98809: _low_level_execute_command(): starting 11728 1726882210.98813: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882210.5558958-13495-97871196092336/ > /dev/null 2>&1 && sleep 0' 11728 1726882210.99231: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882210.99234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882210.99237: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882210.99239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882210.99286: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882210.99289: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 11728 1726882210.99335: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882211.01699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882211.01703: stderr chunk (state=3): >>><<< 11728 1726882211.01708: stdout chunk (state=3): >>><<< 11728 1726882211.01730: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882211.01742: handler run complete 11728 1726882211.02099: attempt loop complete, returning result 11728 1726882211.02103: _execute() done 11728 1726882211.02105: dumping result to json 11728 1726882211.02107: done dumping result, returning 11728 1726882211.02110: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-5c28-a762-00000000069f] 11728 1726882211.02112: sending task result for task 12673a56-9f93-5c28-a762-00000000069f 11728 1726882211.02480: done sending task result for task 12673a56-9f93-5c28-a762-00000000069f 11728 1726882211.02483: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11728 1726882211.02542: no more pending results, returning what we have 11728 1726882211.02546: results queue empty 11728 1726882211.02547: checking for any_errors_fatal 11728 1726882211.02555: done checking for any_errors_fatal 11728 1726882211.02556: checking for max_fail_percentage 11728 1726882211.02558: done checking for max_fail_percentage 11728 1726882211.02558: checking to see if all hosts have failed and the running result is not ok 11728 1726882211.02559: done checking to see if all hosts have failed 11728 1726882211.02560: getting the remaining hosts for this loop 11728 1726882211.02562: done getting the remaining hosts for this loop 11728 1726882211.02565: getting the next task for host managed_node3 11728 1726882211.02573: done getting next task for host managed_node3 11728 1726882211.02576: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11728 1726882211.02582: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882211.02803: getting variables 11728 1726882211.02805: in VariableManager get_vars() 11728 1726882211.02845: Calling all_inventory to load vars for managed_node3 11728 1726882211.02848: Calling groups_inventory to load vars for managed_node3 11728 1726882211.02851: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882211.02861: Calling all_plugins_play to load vars for managed_node3 11728 1726882211.02864: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882211.02868: Calling groups_plugins_play to load vars for managed_node3 11728 1726882211.04955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882211.06534: done with get_vars() 11728 1726882211.06757: done getting variables 11728 1726882211.06826: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:30:11 -0400 (0:00:00.629) 0:00:35.921 ****** 11728 1726882211.06870: entering _queue_task() for managed_node3/service 11728 1726882211.07234: worker is 1 (out of 1 available) 11728 1726882211.07246: exiting _queue_task() for managed_node3/service 11728 1726882211.07257: done queuing things up, now waiting for results queue to drain 11728 1726882211.07259: waiting for pending results... 
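
The module_args echoed in the result above (name=NetworkManager, state=started, enabled=true, scope=system) are what a plain systemd service task produces. A minimal sketch of a task that would yield this invocation, assuming ansible.builtin.systemd and a no_log: true setting to match the censored result shown above (a reconstruction from the log, not the role's actual task file):

    # Hypothetical task reconstructed from the module_args in the log above.
    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager
        state: started
        enabled: true
        scope: system
      no_log: true  # matches the "censored" field in the ok: result above
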
11728 1726882211.07712: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11728 1726882211.07720: in run() - task 12673a56-9f93-5c28-a762-0000000006a0 11728 1726882211.07739: variable 'ansible_search_path' from source: unknown 11728 1726882211.07746: variable 'ansible_search_path' from source: unknown 11728 1726882211.07786: calling self._execute() 11728 1726882211.07883: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882211.07897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882211.07951: variable 'omit' from source: magic vars 11728 1726882211.08404: variable 'ansible_distribution_major_version' from source: facts 11728 1726882211.08458: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882211.08549: variable 'network_provider' from source: set_fact 11728 1726882211.08562: Evaluated conditional (network_provider == "nm"): True 11728 1726882211.08656: variable '__network_wpa_supplicant_required' from source: role '' defaults 11728 1726882211.08749: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11728 1726882211.08921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882211.11590: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882211.11699: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882211.11739: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882211.11797: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882211.11833: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882211.11907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882211.11945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882211.12099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882211.12102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882211.12104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882211.12404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882211.12408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 11728 1726882211.12411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882211.12413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882211.12416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882211.12418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882211.12525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882211.12555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882211.12662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882211.12682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882211.13018: variable 'network_connections' from source: task vars 11728 1726882211.13046: variable 'port2_profile' from source: play vars 11728 1726882211.13149: variable 'port2_profile' from source: play vars 11728 1726882211.13269: variable 'port1_profile' from source: play vars 11728 1726882211.13401: variable 'port1_profile' from source: play vars 11728 1726882211.13404: variable 'controller_profile' from source: play vars 11728 1726882211.13453: variable 'controller_profile' from source: play vars 11728 1726882211.13574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882211.13712: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882211.13754: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882211.13777: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882211.13802: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882211.13841: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882211.13861: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882211.13886: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882211.13910: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882211.13952: variable '__network_wireless_connections_defined' from source: role '' defaults 11728 1726882211.14115: variable 'network_connections' from source: task vars 11728 1726882211.14119: variable 'port2_profile' from source: play vars 11728 1726882211.14170: variable 'port2_profile' from source: play vars 11728 1726882211.14176: variable 'port1_profile' from source: play vars 11728 1726882211.14221: variable 'port1_profile' from source: play vars 11728 1726882211.14233: variable 'controller_profile' from source: play vars 11728 1726882211.14271: variable 'controller_profile' from source: play vars 11728 1726882211.14298: Evaluated conditional (__network_wpa_supplicant_required): False 11728 1726882211.14303: when evaluation is False, skipping this task 11728 1726882211.14306: _execute() done 11728 1726882211.14308: dumping result to json 11728 1726882211.14312: done dumping result, returning 11728 1726882211.14320: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-5c28-a762-0000000006a0] 11728 1726882211.14324: sending task result for task 12673a56-9f93-5c28-a762-0000000006a0 11728 1726882211.14412: done sending task result for task 12673a56-9f93-5c28-a762-0000000006a0 11728 1726882211.14415: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 11728 1726882211.14462: no more pending results, returning what we have 11728 1726882211.14466: results queue empty 11728 1726882211.14467: checking for any_errors_fatal 11728 1726882211.14485: done checking for any_errors_fatal 11728 1726882211.14485: checking for max_fail_percentage 11728 1726882211.14487: done checking for max_fail_percentage 11728 1726882211.14488: checking to see if all hosts have failed and the running result is not ok 11728 1726882211.14488: done checking to see if all hosts have failed 11728 1726882211.14489: getting the remaining hosts for this loop 11728 1726882211.14491: done getting the remaining hosts for this loop 11728 1726882211.14496: getting the next task for host managed_node3 11728 1726882211.14504: done getting next task for host managed_node3 11728 1726882211.14508: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 11728 1726882211.14512: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882211.14529: getting variables 11728 1726882211.14531: in VariableManager get_vars() 11728 1726882211.14569: Calling all_inventory to load vars for managed_node3 11728 1726882211.14571: Calling groups_inventory to load vars for managed_node3 11728 1726882211.14573: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882211.14582: Calling all_plugins_play to load vars for managed_node3 11728 1726882211.14584: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882211.14586: Calling groups_plugins_play to load vars for managed_node3 11728 1726882211.15707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882211.18765: done with get_vars() 11728 1726882211.18822: done getting variables 11728 1726882211.18888: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:30:11 -0400 (0:00:00.120) 0:00:36.041 ****** 11728 1726882211.18930: entering _queue_task() for managed_node3/service 11728 1726882211.19304: worker is 1 (out of 1 available) 11728 1726882211.19319: exiting _queue_task() for managed_node3/service 11728 1726882211.19332: done queuing things up, now waiting for results queue to drain 11728 1726882211.19334: waiting for pending results... 
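
The skip recorded above is the normal outcome of a when: guard evaluating to False; the log shows the conditionals checked for the wpa_supplicant task and reports __network_wpa_supplicant_required as the false_condition. A hedged sketch of a task guarded the same way (the variable names come from the conditionals in the log; the task body itself is an assumption, not the role's actual file):

    # Hypothetical task illustrating the guards evaluated in the log above.
    - name: Enable and start wpa_supplicant
      ansible.builtin.systemd:
        name: wpa_supplicant
        state: started
        enabled: true
      when:
        - ansible_distribution_major_version != '6'
        - network_provider == "nm"
        - __network_wpa_supplicant_required
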
11728 1726882211.19635: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 11728 1726882211.19827: in run() - task 12673a56-9f93-5c28-a762-0000000006a1 11728 1726882211.19831: variable 'ansible_search_path' from source: unknown 11728 1726882211.19834: variable 'ansible_search_path' from source: unknown 11728 1726882211.19858: calling self._execute() 11728 1726882211.19964: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882211.19976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882211.19991: variable 'omit' from source: magic vars 11728 1726882211.20500: variable 'ansible_distribution_major_version' from source: facts 11728 1726882211.20504: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882211.20595: variable 'network_provider' from source: set_fact 11728 1726882211.20612: Evaluated conditional (network_provider == "initscripts"): False 11728 1726882211.20620: when evaluation is False, skipping this task 11728 1726882211.20627: _execute() done 11728 1726882211.20634: dumping result to json 11728 1726882211.20641: done dumping result, returning 11728 1726882211.20652: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-5c28-a762-0000000006a1] 11728 1726882211.20663: sending task result for task 12673a56-9f93-5c28-a762-0000000006a1 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11728 1726882211.20942: no more pending results, returning what we have 11728 1726882211.20947: results queue empty 11728 1726882211.20948: checking for any_errors_fatal 11728 1726882211.20956: done checking for any_errors_fatal 11728 1726882211.20957: checking for max_fail_percentage 11728 1726882211.20960: done checking for max_fail_percentage 11728 1726882211.20961: checking to see if all hosts have failed and the running result is not ok 11728 1726882211.20961: done checking to see if all hosts have failed 11728 1726882211.20962: getting the remaining hosts for this loop 11728 1726882211.20964: done getting the remaining hosts for this loop 11728 1726882211.20968: getting the next task for host managed_node3 11728 1726882211.20976: done getting next task for host managed_node3 11728 1726882211.20981: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11728 1726882211.20986: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882211.21008: getting variables 11728 1726882211.21010: in VariableManager get_vars() 11728 1726882211.21055: Calling all_inventory to load vars for managed_node3 11728 1726882211.21059: Calling groups_inventory to load vars for managed_node3 11728 1726882211.21062: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882211.21074: Calling all_plugins_play to load vars for managed_node3 11728 1726882211.21077: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882211.21081: Calling groups_plugins_play to load vars for managed_node3 11728 1726882211.21657: done sending task result for task 12673a56-9f93-5c28-a762-0000000006a1 11728 1726882211.21662: WORKER PROCESS EXITING 11728 1726882211.23463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882211.24954: done with get_vars() 11728 1726882211.24975: done getting variables 11728 1726882211.25032: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:30:11 -0400 (0:00:00.061) 0:00:36.103 ****** 11728 1726882211.25071: entering _queue_task() for managed_node3/copy 11728 1726882211.25620: worker is 1 (out of 1 available) 11728 1726882211.25628: exiting _queue_task() for managed_node3/copy 11728 1726882211.25637: done queuing things up, now waiting for results queue to drain 11728 1726882211.25638: waiting for pending results... 
11728 1726882211.25696: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11728 1726882211.25845: in run() - task 12673a56-9f93-5c28-a762-0000000006a2 11728 1726882211.25867: variable 'ansible_search_path' from source: unknown 11728 1726882211.25873: variable 'ansible_search_path' from source: unknown 11728 1726882211.25912: calling self._execute() 11728 1726882211.26009: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882211.26019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882211.26033: variable 'omit' from source: magic vars 11728 1726882211.26388: variable 'ansible_distribution_major_version' from source: facts 11728 1726882211.26413: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882211.26526: variable 'network_provider' from source: set_fact 11728 1726882211.26535: Evaluated conditional (network_provider == "initscripts"): False 11728 1726882211.26541: when evaluation is False, skipping this task 11728 1726882211.26546: _execute() done 11728 1726882211.26552: dumping result to json 11728 1726882211.26557: done dumping result, returning 11728 1726882211.26567: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-5c28-a762-0000000006a2] 11728 1726882211.26578: sending task result for task 12673a56-9f93-5c28-a762-0000000006a2 11728 1726882211.26899: done sending task result for task 12673a56-9f93-5c28-a762-0000000006a2 11728 1726882211.26902: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11728 1726882211.26936: no more pending results, returning what we have 11728 1726882211.26938: results queue empty 11728 1726882211.26939: checking for any_errors_fatal 11728 1726882211.26944: done checking for any_errors_fatal 11728 1726882211.26945: checking for max_fail_percentage 11728 1726882211.26946: done checking for max_fail_percentage 11728 1726882211.26947: checking to see if all hosts have failed and the running result is not ok 11728 1726882211.26947: done checking to see if all hosts have failed 11728 1726882211.26948: getting the remaining hosts for this loop 11728 1726882211.26949: done getting the remaining hosts for this loop 11728 1726882211.26952: getting the next task for host managed_node3 11728 1726882211.26958: done getting next task for host managed_node3 11728 1726882211.26961: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11728 1726882211.26965: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882211.26980: getting variables 11728 1726882211.26981: in VariableManager get_vars() 11728 1726882211.27013: Calling all_inventory to load vars for managed_node3 11728 1726882211.27016: Calling groups_inventory to load vars for managed_node3 11728 1726882211.27018: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882211.27027: Calling all_plugins_play to load vars for managed_node3 11728 1726882211.27030: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882211.27033: Calling groups_plugins_play to load vars for managed_node3 11728 1726882211.28438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882211.29949: done with get_vars() 11728 1726882211.29971: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:30:11 -0400 (0:00:00.049) 0:00:36.153 ****** 11728 1726882211.30060: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 11728 1726882211.30363: worker is 1 (out of 1 available) 11728 1726882211.30377: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 11728 1726882211.30389: done queuing things up, now waiting for results queue to drain 11728 1726882211.30390: waiting for pending results... 
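
The task queued here is the role's network_connections action plugin (main.yml:159); the profiles it applies come in through the network_connections variable, which this play assembles from controller_profile, port1_profile and port2_profile (resolved from play vars below). A minimal, hypothetical shape for that variable, assuming a controller profile with two port profiles (the connection types and keys are illustrative assumptions; the actual profile contents are not shown in this log):

    # Hypothetical play vars consistent with the variable names seen in the log.
    network_connections:
      - name: "{{ controller_profile }}"
        type: bond        # assumption: profile type is not visible in this log
        state: up
      - name: "{{ port1_profile }}"
        type: ethernet    # assumption
        controller: "{{ controller_profile }}"
        state: up
      - name: "{{ port2_profile }}"
        type: ethernet    # assumption
        controller: "{{ controller_profile }}"
        state: up
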
11728 1726882211.30674: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11728 1726882211.30838: in run() - task 12673a56-9f93-5c28-a762-0000000006a3 11728 1726882211.30859: variable 'ansible_search_path' from source: unknown 11728 1726882211.30866: variable 'ansible_search_path' from source: unknown 11728 1726882211.30908: calling self._execute() 11728 1726882211.31002: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882211.31014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882211.31030: variable 'omit' from source: magic vars 11728 1726882211.31408: variable 'ansible_distribution_major_version' from source: facts 11728 1726882211.31425: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882211.31437: variable 'omit' from source: magic vars 11728 1726882211.31528: variable 'omit' from source: magic vars 11728 1726882211.31801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882211.33789: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882211.33858: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882211.33906: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882211.33944: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882211.33980: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882211.34065: variable 'network_provider' from source: set_fact 11728 1726882211.34204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882211.34237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882211.34268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882211.34321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882211.34352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882211.34438: variable 'omit' from source: magic vars 11728 1726882211.34551: variable 'omit' from source: magic vars 11728 1726882211.34663: variable 'network_connections' from source: task vars 11728 1726882211.34681: variable 'port2_profile' from source: play vars 11728 1726882211.34750: variable 'port2_profile' from source: play vars 11728 1726882211.34799: variable 'port1_profile' from source: play vars 11728 1726882211.34831: variable 'port1_profile' from source: play vars 11728 1726882211.34848: variable 'controller_profile' from source: 
play vars 11728 1726882211.34911: variable 'controller_profile' from source: play vars 11728 1726882211.35081: variable 'omit' from source: magic vars 11728 1726882211.35097: variable '__lsr_ansible_managed' from source: task vars 11728 1726882211.35170: variable '__lsr_ansible_managed' from source: task vars 11728 1726882211.35356: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 11728 1726882211.35606: Loaded config def from plugin (lookup/template) 11728 1726882211.35610: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 11728 1726882211.35619: File lookup term: get_ansible_managed.j2 11728 1726882211.35628: variable 'ansible_search_path' from source: unknown 11728 1726882211.35638: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 11728 1726882211.35656: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 11728 1726882211.35799: variable 'ansible_search_path' from source: unknown 11728 1726882211.42096: variable 'ansible_managed' from source: unknown 11728 1726882211.42250: variable 'omit' from source: magic vars 11728 1726882211.42284: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882211.42317: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882211.42338: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882211.42365: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882211.42381: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882211.42417: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882211.42426: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882211.42434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882211.42528: Set connection var ansible_connection to ssh 11728 1726882211.42544: Set connection var ansible_shell_executable to /bin/sh 11728 1726882211.42556: Set connection var ansible_timeout to 10 11728 1726882211.42696: Set connection var ansible_shell_type to sh 11728 1726882211.42701: Set connection var ansible_module_compression to 
ZIP_DEFLATED 11728 1726882211.42704: Set connection var ansible_pipelining to False 11728 1726882211.42706: variable 'ansible_shell_executable' from source: unknown 11728 1726882211.42708: variable 'ansible_connection' from source: unknown 11728 1726882211.42711: variable 'ansible_module_compression' from source: unknown 11728 1726882211.42713: variable 'ansible_shell_type' from source: unknown 11728 1726882211.42715: variable 'ansible_shell_executable' from source: unknown 11728 1726882211.42717: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882211.42719: variable 'ansible_pipelining' from source: unknown 11728 1726882211.42721: variable 'ansible_timeout' from source: unknown 11728 1726882211.42731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882211.42800: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11728 1726882211.42847: variable 'omit' from source: magic vars 11728 1726882211.42850: starting attempt loop 11728 1726882211.42852: running the handler 11728 1726882211.42854: _low_level_execute_command(): starting 11728 1726882211.42865: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882211.43582: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882211.43707: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882211.43716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882211.43723: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882211.43737: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882211.43815: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882211.45484: stdout chunk (state=3): >>>/root <<< 11728 1726882211.45638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882211.45641: stdout chunk (state=3): >>><<< 11728 1726882211.45643: stderr chunk (state=3): >>><<< 11728 1726882211.45662: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882211.45677: _low_level_execute_command(): starting 11728 1726882211.45686: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882211.4566805-13561-146352738053174 `" && echo ansible-tmp-1726882211.4566805-13561-146352738053174="` echo /root/.ansible/tmp/ansible-tmp-1726882211.4566805-13561-146352738053174 `" ) && sleep 0' 11728 1726882211.46269: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882211.46282: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882211.46304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882211.46346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882211.46361: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882211.46371: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882211.46383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882211.46410: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882211.46426: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882211.46441: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11728 1726882211.46456: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882211.46548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882211.46590: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882211.46671: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882211.48741: stdout chunk (state=3): >>>ansible-tmp-1726882211.4566805-13561-146352738053174=/root/.ansible/tmp/ansible-tmp-1726882211.4566805-13561-146352738053174 <<< 11728 1726882211.48797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882211.48808: stdout chunk (state=3): >>><<< 11728 1726882211.48822: 
stderr chunk (state=3): >>><<< 11728 1726882211.48845: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882211.4566805-13561-146352738053174=/root/.ansible/tmp/ansible-tmp-1726882211.4566805-13561-146352738053174 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882211.48900: variable 'ansible_module_compression' from source: unknown 11728 1726882211.49200: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 11728 1726882211.49209: variable 'ansible_facts' from source: unknown 11728 1726882211.49295: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882211.4566805-13561-146352738053174/AnsiballZ_network_connections.py 11728 1726882211.49755: Sending initial data 11728 1726882211.49764: Sent initial data (168 bytes) 11728 1726882211.50729: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882211.50809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882211.50935: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882211.50954: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882211.51006: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882211.51099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882211.52580: stderr chunk (state=3): >>>debug2: 
Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882211.52621: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882211.52663: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882211.4566805-13561-146352738053174/AnsiballZ_network_connections.py" <<< 11728 1726882211.52814: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpu7kjpthr /root/.ansible/tmp/ansible-tmp-1726882211.4566805-13561-146352738053174/AnsiballZ_network_connections.py <<< 11728 1726882211.52843: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpu7kjpthr" to remote "/root/.ansible/tmp/ansible-tmp-1726882211.4566805-13561-146352738053174/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882211.4566805-13561-146352738053174/AnsiballZ_network_connections.py" <<< 11728 1726882211.54287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882211.54352: stderr chunk (state=3): >>><<< 11728 1726882211.54355: stdout chunk (state=3): >>><<< 11728 1726882211.54378: done transferring module to remote 11728 1726882211.54459: _low_level_execute_command(): starting 11728 1726882211.54463: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882211.4566805-13561-146352738053174/ /root/.ansible/tmp/ansible-tmp-1726882211.4566805-13561-146352738053174/AnsiballZ_network_connections.py && sleep 0' 11728 1726882211.55632: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882211.55636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882211.55685: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882211.55688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11728 1726882211.55691: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882211.55698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882211.55747: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882211.55753: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882211.55770: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882211.55842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882211.57602: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882211.57606: stdout chunk (state=3): >>><<< 11728 1726882211.57608: stderr chunk (state=3): >>><<< 11728 1726882211.57723: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882211.57727: _low_level_execute_command(): starting 11728 1726882211.57729: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882211.4566805-13561-146352738053174/AnsiballZ_network_connections.py && sleep 0' 11728 1726882211.58191: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882211.58204: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882211.58215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882211.58229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882211.58241: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882211.58249: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882211.58257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882211.58271: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882211.58287: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882211.58291: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11728 1726882211.58295: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882211.58315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 
1726882211.58424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882211.58427: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882211.58430: stderr chunk (state=3): >>>debug2: match found <<< 11728 1726882211.58432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882211.58434: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882211.58436: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882211.58464: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882211.58540: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882212.14232: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_hbz4dov5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_hbz4dov5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/595a8ae5-0b4e-4403-aa9d-4e632858ba4c: error=unknown <<< 11728 1726882212.15967: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_hbz4dov5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_hbz4dov5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/7846a1e6-a3d1-419f-996a-824f28a6a5c0: error=unknown <<< 11728 1726882212.17689: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_hbz4dov5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_hbz4dov5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail <<< 11728 1726882212.17718: stdout chunk (state=3): >>>ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/a9f108ff-93a0-4692-a961-7fb7246e6129: error=unknown <<< 11728 1726882212.17910: stdout chunk (state=3): >>> 
{"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 11728 1726882212.19801: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882212.19814: stdout chunk (state=3): >>><<< 11728 1726882212.19826: stderr chunk (state=3): >>><<< 11728 1726882212.19856: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_hbz4dov5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_hbz4dov5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/595a8ae5-0b4e-4403-aa9d-4e632858ba4c: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_hbz4dov5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_hbz4dov5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/7846a1e6-a3d1-419f-996a-824f28a6a5c0: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_hbz4dov5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_hbz4dov5/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on 
bond0/a9f108ff-93a0-4692-a961-7fb7246e6129: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
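Note: the module_args recorded above are the interface the fedora.linux_system_roles.network role presents to its network_connections module; the bond profile and the two bond0.* profiles are requested with persistent_state: absent and state: down. A minimal sketch of a play that would drive the role this way follows; the play layout and variable placement are assumptions for illustration, and only the connections list is taken from the log.

    - hosts: managed_node3
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            network_connections:
              # Remove the two bond0.* profiles first, then the bond itself,
              # matching the order seen in module_args above.
              - name: bond0.1
                persistent_state: absent
                state: down
              - name: bond0.0
                persistent_state: absent
                state: down
              - name: bond0
                persistent_state: absent
                state: down

The __header string in module_args is added by the role itself (note the get_ansible_managed.j2 template lookup earlier in this task), so the play does not need to set it.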
11728 1726882212.19918: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882211.4566805-13561-146352738053174/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882212.20010: _low_level_execute_command(): starting 11728 1726882212.20014: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882211.4566805-13561-146352738053174/ > /dev/null 2>&1 && sleep 0' 11728 1726882212.20750: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882212.20765: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882212.20780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882212.20802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882212.20837: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 11728 1726882212.20911: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882212.20945: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882212.20972: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882212.20996: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882212.21073: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882212.22946: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882212.22949: stdout chunk (state=3): >>><<< 11728 1726882212.22952: stderr chunk (state=3): >>><<< 11728 1726882212.23119: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882212.23123: handler run complete 11728 1726882212.23125: attempt loop complete, returning result 11728 1726882212.23127: _execute() done 11728 1726882212.23129: dumping result to json 11728 1726882212.23131: done dumping result, returning 11728 1726882212.23133: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-5c28-a762-0000000006a3] 11728 1726882212.23134: sending task result for task 12673a56-9f93-5c28-a762-0000000006a3 11728 1726882212.23431: done sending task result for task 12673a56-9f93-5c28-a762-0000000006a3 11728 1726882212.23434: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 11728 1726882212.23553: no more pending results, returning what we have 11728 1726882212.23557: results queue empty 11728 1726882212.23558: checking for any_errors_fatal 11728 1726882212.23565: done checking for any_errors_fatal 11728 1726882212.23566: checking for max_fail_percentage 11728 1726882212.23567: done checking for max_fail_percentage 11728 1726882212.23568: checking to see if all hosts have failed and the running result is not ok 11728 1726882212.23569: done checking to see if all hosts have failed 11728 1726882212.23570: getting the remaining hosts for this loop 11728 1726882212.23572: done getting the remaining hosts for this loop 11728 1726882212.23575: getting the next task for host managed_node3 11728 1726882212.23582: done getting next task for host managed_node3 11728 1726882212.23585: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 11728 1726882212.23590: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882212.23610: getting variables 11728 1726882212.23616: in VariableManager get_vars() 11728 1726882212.23655: Calling all_inventory to load vars for managed_node3 11728 1726882212.23657: Calling groups_inventory to load vars for managed_node3 11728 1726882212.23660: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882212.23670: Calling all_plugins_play to load vars for managed_node3 11728 1726882212.23673: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882212.23675: Calling groups_plugins_play to load vars for managed_node3 11728 1726882212.25455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882212.27959: done with get_vars() 11728 1726882212.27982: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:30:12 -0400 (0:00:00.980) 0:00:37.133 ****** 11728 1726882212.28077: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 11728 1726882212.28452: worker is 1 (out of 1 available) 11728 1726882212.28465: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 11728 1726882212.28478: done queuing things up, now waiting for results queue to drain 11728 1726882212.28479: waiting for pending results... 
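The Configure networking state task queued above (roles/network/tasks/main.yml:171) is guarded by the condition network_state != {}; network_state comes from the role defaults and is empty here, so the evaluation below skips the task. A minimal sketch of that guard follows; the desired_state argument name is an assumption for illustration only, since the log shows just the module being queued and the condition evaluating to False.

    - name: Configure networking state
      fedora.linux_system_roles.network_state:
        desired_state: "{{ network_state }}"  # argument name assumed for illustration only
      when: network_state != {}  # False with the empty role default, so the task is skipped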
11728 1726882212.28763: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 11728 1726882212.28920: in run() - task 12673a56-9f93-5c28-a762-0000000006a4 11728 1726882212.28946: variable 'ansible_search_path' from source: unknown 11728 1726882212.28955: variable 'ansible_search_path' from source: unknown 11728 1726882212.28998: calling self._execute() 11728 1726882212.29104: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882212.29115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882212.29139: variable 'omit' from source: magic vars 11728 1726882212.29545: variable 'ansible_distribution_major_version' from source: facts 11728 1726882212.29572: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882212.29779: variable 'network_state' from source: role '' defaults 11728 1726882212.29784: Evaluated conditional (network_state != {}): False 11728 1726882212.29786: when evaluation is False, skipping this task 11728 1726882212.29788: _execute() done 11728 1726882212.29790: dumping result to json 11728 1726882212.29794: done dumping result, returning 11728 1726882212.29797: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-5c28-a762-0000000006a4] 11728 1726882212.29799: sending task result for task 12673a56-9f93-5c28-a762-0000000006a4 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11728 1726882212.29928: no more pending results, returning what we have 11728 1726882212.29933: results queue empty 11728 1726882212.29934: checking for any_errors_fatal 11728 1726882212.29948: done checking for any_errors_fatal 11728 1726882212.29949: checking for max_fail_percentage 11728 1726882212.29951: done checking for max_fail_percentage 11728 1726882212.29952: checking to see if all hosts have failed and the running result is not ok 11728 1726882212.29953: done checking to see if all hosts have failed 11728 1726882212.29953: getting the remaining hosts for this loop 11728 1726882212.29955: done getting the remaining hosts for this loop 11728 1726882212.29958: getting the next task for host managed_node3 11728 1726882212.29967: done getting next task for host managed_node3 11728 1726882212.29972: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11728 1726882212.29977: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882212.30107: getting variables 11728 1726882212.30109: in VariableManager get_vars() 11728 1726882212.30148: Calling all_inventory to load vars for managed_node3 11728 1726882212.30151: Calling groups_inventory to load vars for managed_node3 11728 1726882212.30153: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882212.30166: Calling all_plugins_play to load vars for managed_node3 11728 1726882212.30169: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882212.30173: Calling groups_plugins_play to load vars for managed_node3 11728 1726882212.30807: done sending task result for task 12673a56-9f93-5c28-a762-0000000006a4 11728 1726882212.30811: WORKER PROCESS EXITING 11728 1726882212.31759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882212.33412: done with get_vars() 11728 1726882212.33437: done getting variables 11728 1726882212.33508: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:30:12 -0400 (0:00:00.054) 0:00:37.187 ****** 11728 1726882212.33548: entering _queue_task() for managed_node3/debug 11728 1726882212.34010: worker is 1 (out of 1 available) 11728 1726882212.34021: exiting _queue_task() for managed_node3/debug 11728 1726882212.34032: done queuing things up, now waiting for results queue to drain 11728 1726882212.34033: waiting for pending results... 
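The Configure networking connection profiles task above finished with changed: true and an empty STDERR even though the volatilize tracebacks were written to the module's stdout. The two debug tasks that run next (main.yml:177 and then :181) print the captured result so those streams are visible in the play output. A minimal sketch of that pattern, assuming the result is available as __network_connections_result (the variable source shown below is set_fact):

    # Print the result captured from the network_connections module run.
    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines

    - name: Show debug messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result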
11728 1726882212.34225: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11728 1726882212.34455: in run() - task 12673a56-9f93-5c28-a762-0000000006a5 11728 1726882212.34459: variable 'ansible_search_path' from source: unknown 11728 1726882212.34461: variable 'ansible_search_path' from source: unknown 11728 1726882212.34471: calling self._execute() 11728 1726882212.34572: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882212.34588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882212.34606: variable 'omit' from source: magic vars 11728 1726882212.34988: variable 'ansible_distribution_major_version' from source: facts 11728 1726882212.35020: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882212.35099: variable 'omit' from source: magic vars 11728 1726882212.35111: variable 'omit' from source: magic vars 11728 1726882212.35155: variable 'omit' from source: magic vars 11728 1726882212.35199: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882212.35299: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882212.35303: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882212.35306: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882212.35310: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882212.35353: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882212.35362: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882212.35369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882212.35476: Set connection var ansible_connection to ssh 11728 1726882212.35494: Set connection var ansible_shell_executable to /bin/sh 11728 1726882212.35507: Set connection var ansible_timeout to 10 11728 1726882212.35514: Set connection var ansible_shell_type to sh 11728 1726882212.35543: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882212.35546: Set connection var ansible_pipelining to False 11728 1726882212.35571: variable 'ansible_shell_executable' from source: unknown 11728 1726882212.35578: variable 'ansible_connection' from source: unknown 11728 1726882212.35653: variable 'ansible_module_compression' from source: unknown 11728 1726882212.35656: variable 'ansible_shell_type' from source: unknown 11728 1726882212.35658: variable 'ansible_shell_executable' from source: unknown 11728 1726882212.35662: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882212.35664: variable 'ansible_pipelining' from source: unknown 11728 1726882212.35666: variable 'ansible_timeout' from source: unknown 11728 1726882212.35668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882212.35784: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 
1726882212.35868: variable 'omit' from source: magic vars 11728 1726882212.35872: starting attempt loop 11728 1726882212.35874: running the handler 11728 1726882212.35945: variable '__network_connections_result' from source: set_fact 11728 1726882212.36012: handler run complete 11728 1726882212.36034: attempt loop complete, returning result 11728 1726882212.36040: _execute() done 11728 1726882212.36045: dumping result to json 11728 1726882212.36051: done dumping result, returning 11728 1726882212.36063: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-5c28-a762-0000000006a5] 11728 1726882212.36074: sending task result for task 12673a56-9f93-5c28-a762-0000000006a5 11728 1726882212.36263: done sending task result for task 12673a56-9f93-5c28-a762-0000000006a5 11728 1726882212.36266: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 11728 1726882212.36342: no more pending results, returning what we have 11728 1726882212.36347: results queue empty 11728 1726882212.36348: checking for any_errors_fatal 11728 1726882212.36357: done checking for any_errors_fatal 11728 1726882212.36358: checking for max_fail_percentage 11728 1726882212.36360: done checking for max_fail_percentage 11728 1726882212.36361: checking to see if all hosts have failed and the running result is not ok 11728 1726882212.36362: done checking to see if all hosts have failed 11728 1726882212.36363: getting the remaining hosts for this loop 11728 1726882212.36364: done getting the remaining hosts for this loop 11728 1726882212.36368: getting the next task for host managed_node3 11728 1726882212.36376: done getting next task for host managed_node3 11728 1726882212.36379: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11728 1726882212.36384: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882212.36402: getting variables 11728 1726882212.36404: in VariableManager get_vars() 11728 1726882212.36613: Calling all_inventory to load vars for managed_node3 11728 1726882212.36616: Calling groups_inventory to load vars for managed_node3 11728 1726882212.36618: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882212.36627: Calling all_plugins_play to load vars for managed_node3 11728 1726882212.36630: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882212.36633: Calling groups_plugins_play to load vars for managed_node3 11728 1726882212.37775: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882212.38624: done with get_vars() 11728 1726882212.38639: done getting variables 11728 1726882212.38681: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:30:12 -0400 (0:00:00.051) 0:00:37.239 ****** 11728 1726882212.38716: entering _queue_task() for managed_node3/debug 11728 1726882212.39015: worker is 1 (out of 1 available) 11728 1726882212.39028: exiting _queue_task() for managed_node3/debug 11728 1726882212.39040: done queuing things up, now waiting for results queue to drain 11728 1726882212.39041: waiting for pending results... 11728 1726882212.39516: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11728 1726882212.39521: in run() - task 12673a56-9f93-5c28-a762-0000000006a6 11728 1726882212.39524: variable 'ansible_search_path' from source: unknown 11728 1726882212.39527: variable 'ansible_search_path' from source: unknown 11728 1726882212.39553: calling self._execute() 11728 1726882212.39656: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882212.39668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882212.39682: variable 'omit' from source: magic vars 11728 1726882212.40209: variable 'ansible_distribution_major_version' from source: facts 11728 1726882212.40220: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882212.40226: variable 'omit' from source: magic vars 11728 1726882212.40280: variable 'omit' from source: magic vars 11728 1726882212.40307: variable 'omit' from source: magic vars 11728 1726882212.40354: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882212.40389: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882212.40405: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882212.40419: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882212.40429: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882212.40453: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882212.40456: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882212.40459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882212.40529: Set connection var ansible_connection to ssh 11728 1726882212.40537: Set connection var ansible_shell_executable to /bin/sh 11728 1726882212.40542: Set connection var ansible_timeout to 10 11728 1726882212.40545: Set connection var ansible_shell_type to sh 11728 1726882212.40551: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882212.40556: Set connection var ansible_pipelining to False 11728 1726882212.40573: variable 'ansible_shell_executable' from source: unknown 11728 1726882212.40576: variable 'ansible_connection' from source: unknown 11728 1726882212.40579: variable 'ansible_module_compression' from source: unknown 11728 1726882212.40581: variable 'ansible_shell_type' from source: unknown 11728 1726882212.40583: variable 'ansible_shell_executable' from source: unknown 11728 1726882212.40587: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882212.40589: variable 'ansible_pipelining' from source: unknown 11728 1726882212.40591: variable 'ansible_timeout' from source: unknown 11728 1726882212.40603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882212.40697: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882212.40710: variable 'omit' from source: magic vars 11728 1726882212.40714: starting attempt loop 11728 1726882212.40717: running the handler 11728 1726882212.40753: variable '__network_connections_result' from source: set_fact 11728 1726882212.40809: variable '__network_connections_result' from source: set_fact 11728 1726882212.40900: handler run complete 11728 1726882212.40918: attempt loop complete, returning result 11728 1726882212.40922: _execute() done 11728 1726882212.40925: dumping result to json 11728 1726882212.40927: done dumping result, returning 11728 1726882212.40942: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-5c28-a762-0000000006a6] 11728 1726882212.40945: sending task result for task 12673a56-9f93-5c28-a762-0000000006a6 11728 1726882212.41034: done sending task result for task 12673a56-9f93-5c28-a762-0000000006a6 11728 1726882212.41039: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 11728 1726882212.41134: no more pending results, returning what we have 11728 1726882212.41138: results queue empty 11728 1726882212.41139: checking for any_errors_fatal 11728 1726882212.41145: done checking for any_errors_fatal 11728 
1726882212.41146: checking for max_fail_percentage 11728 1726882212.41147: done checking for max_fail_percentage 11728 1726882212.41148: checking to see if all hosts have failed and the running result is not ok 11728 1726882212.41149: done checking to see if all hosts have failed 11728 1726882212.41154: getting the remaining hosts for this loop 11728 1726882212.41156: done getting the remaining hosts for this loop 11728 1726882212.41159: getting the next task for host managed_node3 11728 1726882212.41165: done getting next task for host managed_node3 11728 1726882212.41169: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11728 1726882212.41173: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882212.41183: getting variables 11728 1726882212.41184: in VariableManager get_vars() 11728 1726882212.41218: Calling all_inventory to load vars for managed_node3 11728 1726882212.41226: Calling groups_inventory to load vars for managed_node3 11728 1726882212.41228: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882212.41236: Calling all_plugins_play to load vars for managed_node3 11728 1726882212.41238: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882212.41241: Calling groups_plugins_play to load vars for managed_node3 11728 1726882212.42015: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882212.42883: done with get_vars() 11728 1726882212.42905: done getting variables 11728 1726882212.42945: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:30:12 -0400 (0:00:00.042) 0:00:37.282 ****** 11728 1726882212.42970: entering _queue_task() for managed_node3/debug 11728 1726882212.43213: worker is 1 (out of 1 available) 11728 1726882212.43226: exiting _queue_task() for managed_node3/debug 11728 1726882212.43238: done queuing things up, now waiting for results queue to drain 11728 1726882212.43240: waiting for pending results... 11728 1726882212.43417: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11728 1726882212.43513: in run() - task 12673a56-9f93-5c28-a762-0000000006a7 11728 1726882212.43524: variable 'ansible_search_path' from source: unknown 11728 1726882212.43527: variable 'ansible_search_path' from source: unknown 11728 1726882212.43553: calling self._execute() 11728 1726882212.43638: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882212.43642: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882212.43651: variable 'omit' from source: magic vars 11728 1726882212.43945: variable 'ansible_distribution_major_version' from source: facts 11728 1726882212.43948: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882212.44300: variable 'network_state' from source: role '' defaults 11728 1726882212.44303: Evaluated conditional (network_state != {}): False 11728 1726882212.44305: when evaluation is False, skipping this task 11728 1726882212.44307: _execute() done 11728 1726882212.44309: dumping result to json 11728 1726882212.44311: done dumping result, returning 11728 1726882212.44313: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-5c28-a762-0000000006a7] 11728 1726882212.44315: sending task result for task 12673a56-9f93-5c28-a762-0000000006a7 11728 1726882212.44373: done sending task result for task 12673a56-9f93-5c28-a762-0000000006a7 11728 1726882212.44375: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 11728 1726882212.44440: no more pending results, returning what we 
have 11728 1726882212.44444: results queue empty 11728 1726882212.44445: checking for any_errors_fatal 11728 1726882212.44452: done checking for any_errors_fatal 11728 1726882212.44453: checking for max_fail_percentage 11728 1726882212.44454: done checking for max_fail_percentage 11728 1726882212.44456: checking to see if all hosts have failed and the running result is not ok 11728 1726882212.44456: done checking to see if all hosts have failed 11728 1726882212.44457: getting the remaining hosts for this loop 11728 1726882212.44459: done getting the remaining hosts for this loop 11728 1726882212.44462: getting the next task for host managed_node3 11728 1726882212.44468: done getting next task for host managed_node3 11728 1726882212.44479: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 11728 1726882212.44485: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882212.44512: getting variables 11728 1726882212.44514: in VariableManager get_vars() 11728 1726882212.44545: Calling all_inventory to load vars for managed_node3 11728 1726882212.44547: Calling groups_inventory to load vars for managed_node3 11728 1726882212.44549: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882212.44557: Calling all_plugins_play to load vars for managed_node3 11728 1726882212.44560: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882212.44563: Calling groups_plugins_play to load vars for managed_node3 11728 1726882212.45835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882212.46690: done with get_vars() 11728 1726882212.46709: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:30:12 -0400 (0:00:00.038) 0:00:37.320 ****** 11728 1726882212.46779: entering _queue_task() for managed_node3/ping 11728 1726882212.47028: worker is 1 (out of 1 available) 11728 1726882212.47042: exiting _queue_task() for managed_node3/ping 11728 1726882212.47055: done queuing things up, now waiting for results queue to drain 11728 1726882212.47057: waiting for pending results... 
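The skipping: result above follows directly from the logged conditional: network_state comes from the role defaults and is still an empty dict, so "network_state != {}" evaluates False and the debug task is bypassed. Reconstructed from the task name, the task path (roles/network/tasks/main.yml:186) and the false_condition in the result, the task is shaped roughly like the sketch below; only the name, module and condition are confirmed by the log, the rest is illustrative.

    # Sketch of the skipped task (reconstructed from the log, not the role's verbatim source)
    - name: Show debug messages for the network_state
      debug:
        var: network_state
      when: network_state != {}
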
11728 1726882212.47298: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 11728 1726882212.47602: in run() - task 12673a56-9f93-5c28-a762-0000000006a8 11728 1726882212.47606: variable 'ansible_search_path' from source: unknown 11728 1726882212.47611: variable 'ansible_search_path' from source: unknown 11728 1726882212.47614: calling self._execute() 11728 1726882212.47617: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882212.47619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882212.47622: variable 'omit' from source: magic vars 11728 1726882212.47957: variable 'ansible_distribution_major_version' from source: facts 11728 1726882212.47962: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882212.47970: variable 'omit' from source: magic vars 11728 1726882212.48043: variable 'omit' from source: magic vars 11728 1726882212.48079: variable 'omit' from source: magic vars 11728 1726882212.48119: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882212.48153: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882212.48177: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882212.48198: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882212.48214: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882212.48239: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882212.48242: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882212.48245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882212.48343: Set connection var ansible_connection to ssh 11728 1726882212.48352: Set connection var ansible_shell_executable to /bin/sh 11728 1726882212.48363: Set connection var ansible_timeout to 10 11728 1726882212.48366: Set connection var ansible_shell_type to sh 11728 1726882212.48369: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882212.48371: Set connection var ansible_pipelining to False 11728 1726882212.48395: variable 'ansible_shell_executable' from source: unknown 11728 1726882212.48401: variable 'ansible_connection' from source: unknown 11728 1726882212.48404: variable 'ansible_module_compression' from source: unknown 11728 1726882212.48407: variable 'ansible_shell_type' from source: unknown 11728 1726882212.48409: variable 'ansible_shell_executable' from source: unknown 11728 1726882212.48411: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882212.48416: variable 'ansible_pipelining' from source: unknown 11728 1726882212.48418: variable 'ansible_timeout' from source: unknown 11728 1726882212.48423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882212.48619: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11728 1726882212.48630: variable 'omit' from source: magic vars 11728 
1726882212.48635: starting attempt loop 11728 1726882212.48637: running the handler 11728 1726882212.48698: _low_level_execute_command(): starting 11728 1726882212.48702: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882212.49313: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882212.49344: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882212.49363: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882212.49377: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882212.49473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882212.51132: stdout chunk (state=3): >>>/root <<< 11728 1726882212.51234: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882212.51262: stderr chunk (state=3): >>><<< 11728 1726882212.51266: stdout chunk (state=3): >>><<< 11728 1726882212.51288: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882212.51304: _low_level_execute_command(): starting 11728 1726882212.51313: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882212.512881-13619-148828518848705 `" && echo ansible-tmp-1726882212.512881-13619-148828518848705="` echo 
/root/.ansible/tmp/ansible-tmp-1726882212.512881-13619-148828518848705 `" ) && sleep 0' 11728 1726882212.51798: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882212.51861: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882212.51884: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882212.51973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882212.53814: stdout chunk (state=3): >>>ansible-tmp-1726882212.512881-13619-148828518848705=/root/.ansible/tmp/ansible-tmp-1726882212.512881-13619-148828518848705 <<< 11728 1726882212.53923: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882212.53949: stderr chunk (state=3): >>><<< 11728 1726882212.53953: stdout chunk (state=3): >>><<< 11728 1726882212.53968: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882212.512881-13619-148828518848705=/root/.ansible/tmp/ansible-tmp-1726882212.512881-13619-148828518848705 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882212.54009: variable 'ansible_module_compression' from source: unknown 11728 1726882212.54042: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 11728 1726882212.54075: variable 'ansible_facts' from source: unknown 11728 1726882212.54126: transferring module to 
remote /root/.ansible/tmp/ansible-tmp-1726882212.512881-13619-148828518848705/AnsiballZ_ping.py 11728 1726882212.54226: Sending initial data 11728 1726882212.54230: Sent initial data (152 bytes) 11728 1726882212.54687: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882212.54781: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882212.54799: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882212.54832: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882212.54887: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882212.56400: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882212.56448: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882212.56518: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpw0l8jlud /root/.ansible/tmp/ansible-tmp-1726882212.512881-13619-148828518848705/AnsiballZ_ping.py <<< 11728 1726882212.56521: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882212.512881-13619-148828518848705/AnsiballZ_ping.py" <<< 11728 1726882212.56590: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpw0l8jlud" to remote "/root/.ansible/tmp/ansible-tmp-1726882212.512881-13619-148828518848705/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882212.512881-13619-148828518848705/AnsiballZ_ping.py" <<< 11728 1726882212.57414: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882212.57447: stderr chunk (state=3): >>><<< 11728 1726882212.57450: stdout chunk (state=3): >>><<< 11728 1726882212.57475: done transferring module to remote 11728 1726882212.57492: _low_level_execute_command(): starting 11728 1726882212.57498: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882212.512881-13619-148828518848705/ /root/.ansible/tmp/ansible-tmp-1726882212.512881-13619-148828518848705/AnsiballZ_ping.py && sleep 0' 11728 1726882212.58096: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882212.58100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882212.58102: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882212.58107: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882212.58109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 11728 1726882212.58111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882212.58214: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882212.58217: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882212.58248: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882212.60049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882212.60060: stdout chunk (state=3): >>><<< 11728 1726882212.60063: stderr chunk (state=3): >>><<< 11728 1726882212.60087: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882212.60091: _low_level_execute_command(): starting 11728 1726882212.60097: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882212.512881-13619-148828518848705/AnsiballZ_ping.py && sleep 0' 11728 1726882212.60534: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882212.60537: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882212.60539: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882212.60541: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 11728 1726882212.60543: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882212.60545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882212.60594: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882212.60605: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882212.60648: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882212.75384: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 11728 1726882212.76884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 11728 1726882212.76888: stdout chunk (state=3): >>><<< 11728 1726882212.76891: stderr chunk (state=3): >>><<< 11728 1726882212.76990: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 11728 1726882212.76998: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882212.512881-13619-148828518848705/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882212.77011: _low_level_execute_command(): starting 11728 1726882212.77032: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882212.512881-13619-148828518848705/ > /dev/null 2>&1 && sleep 0' 11728 1726882212.77890: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882212.77897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882212.77900: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882212.77903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 11728 1726882212.77905: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882212.77982: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882212.78027: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882212.79898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882212.79905: stdout chunk (state=3): >>><<< 11728 1726882212.79908: stderr chunk (state=3): >>><<< 11728 1726882212.80049: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882212.80058: handler run complete 11728 1726882212.80061: attempt loop complete, returning result 11728 1726882212.80063: _execute() done 11728 1726882212.80065: dumping result to json 11728 1726882212.80067: done dumping result, returning 11728 1726882212.80069: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-5c28-a762-0000000006a8] 11728 1726882212.80071: sending task result for task 12673a56-9f93-5c28-a762-0000000006a8 11728 1726882212.80145: done sending task result for task 12673a56-9f93-5c28-a762-0000000006a8 11728 1726882212.80151: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 11728 1726882212.80228: no more pending results, returning what we have 11728 1726882212.80232: results queue empty 11728 1726882212.80233: checking for any_errors_fatal 11728 1726882212.80240: done checking for any_errors_fatal 11728 1726882212.80240: checking for max_fail_percentage 11728 1726882212.80242: done checking for max_fail_percentage 11728 1726882212.80243: checking to see if all hosts have failed and the running result is not ok 11728 1726882212.80244: done checking to see if all hosts have failed 11728 1726882212.80245: getting the remaining hosts for this loop 11728 1726882212.80248: done getting the remaining hosts for this loop 11728 1726882212.80252: getting the next task for host managed_node3 11728 1726882212.80265: done getting next task for host managed_node3 11728 1726882212.80267: ^ task is: TASK: meta (role_complete) 11728 1726882212.80274: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882212.80286: getting variables 11728 1726882212.80287: in VariableManager get_vars() 11728 1726882212.80335: Calling all_inventory to load vars for managed_node3 11728 1726882212.80338: Calling groups_inventory to load vars for managed_node3 11728 1726882212.80340: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882212.80351: Calling all_plugins_play to load vars for managed_node3 11728 1726882212.80354: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882212.80358: Calling groups_plugins_play to load vars for managed_node3 11728 1726882212.82357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882212.84248: done with get_vars() 11728 1726882212.84271: done getting variables 11728 1726882212.84360: done queuing things up, now waiting for results queue to drain 11728 1726882212.84363: results queue empty 11728 1726882212.84364: checking for any_errors_fatal 11728 1726882212.84366: done checking for any_errors_fatal 11728 1726882212.84367: checking for max_fail_percentage 11728 1726882212.84368: done checking for max_fail_percentage 11728 1726882212.84369: checking to see if all hosts have failed and the running result is not ok 11728 1726882212.84370: done checking to see if all hosts have failed 11728 1726882212.84370: getting the remaining hosts for this loop 11728 1726882212.84371: done getting the remaining hosts for this loop 11728 1726882212.84374: getting the next task for host managed_node3 11728 1726882212.84378: done getting next task for host managed_node3 11728 1726882212.84380: ^ task is: TASK: Delete the device '{{ controller_device }}' 11728 1726882212.84384: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882212.84387: getting variables 11728 1726882212.84388: in VariableManager get_vars() 11728 1726882212.84405: Calling all_inventory to load vars for managed_node3 11728 1726882212.84407: Calling groups_inventory to load vars for managed_node3 11728 1726882212.84409: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882212.84414: Calling all_plugins_play to load vars for managed_node3 11728 1726882212.84416: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882212.84420: Calling groups_plugins_play to load vars for managed_node3 11728 1726882212.85503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882212.86349: done with get_vars() 11728 1726882212.86363: done getting variables 11728 1726882212.86398: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11728 1726882212.86486: variable 'controller_device' from source: play vars TASK [Delete the device 'nm-bond'] ********************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml:22 Friday 20 September 2024 21:30:12 -0400 (0:00:00.397) 0:00:37.717 ****** 11728 1726882212.86513: entering _queue_task() for managed_node3/command 11728 1726882212.86763: worker is 1 (out of 1 available) 11728 1726882212.86776: exiting _queue_task() for managed_node3/command 11728 1726882212.86789: done queuing things up, now waiting for results queue to drain 11728 1726882212.86790: waiting for pending results... 
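The Re-test connectivity run above shows the full module round trip in the low-level log: a temporary directory is created under /root/.ansible/tmp, the cached AnsiballZ_ping.py payload is uploaded over SFTP, marked executable, executed with /usr/bin/python3.12, the JSON result {"ping": "pong"} is read back, and the temporary directory is removed. The task that drives it is minimal; based on the logged task name and the ping action it is roughly the sketch below (the same round trip can also be reproduced ad hoc with the ansible CLI and -m ping against this host, assuming the same inventory).

    # Sketch of the connectivity re-test task (only the name and module are confirmed by the log)
    - name: Re-test connectivity
      ping:
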
11728 1726882212.86984: running TaskExecutor() for managed_node3/TASK: Delete the device 'nm-bond' 11728 1726882212.87089: in run() - task 12673a56-9f93-5c28-a762-0000000006d8 11728 1726882212.87260: variable 'ansible_search_path' from source: unknown 11728 1726882212.87264: variable 'ansible_search_path' from source: unknown 11728 1726882212.87399: calling self._execute() 11728 1726882212.87404: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882212.87406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882212.87409: variable 'omit' from source: magic vars 11728 1726882212.87641: variable 'ansible_distribution_major_version' from source: facts 11728 1726882212.87658: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882212.87668: variable 'omit' from source: magic vars 11728 1726882212.87689: variable 'omit' from source: magic vars 11728 1726882212.87800: variable 'controller_device' from source: play vars 11728 1726882212.87849: variable 'omit' from source: magic vars 11728 1726882212.87909: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882212.87967: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882212.88017: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882212.88059: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882212.88101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882212.88105: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882212.88109: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882212.88114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882212.88181: Set connection var ansible_connection to ssh 11728 1726882212.88196: Set connection var ansible_shell_executable to /bin/sh 11728 1726882212.88200: Set connection var ansible_timeout to 10 11728 1726882212.88203: Set connection var ansible_shell_type to sh 11728 1726882212.88211: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882212.88215: Set connection var ansible_pipelining to False 11728 1726882212.88233: variable 'ansible_shell_executable' from source: unknown 11728 1726882212.88236: variable 'ansible_connection' from source: unknown 11728 1726882212.88238: variable 'ansible_module_compression' from source: unknown 11728 1726882212.88240: variable 'ansible_shell_type' from source: unknown 11728 1726882212.88243: variable 'ansible_shell_executable' from source: unknown 11728 1726882212.88245: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882212.88249: variable 'ansible_pipelining' from source: unknown 11728 1726882212.88251: variable 'ansible_timeout' from source: unknown 11728 1726882212.88255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882212.88355: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 11728 1726882212.88363: variable 'omit' from source: magic vars 11728 1726882212.88368: starting attempt loop 11728 1726882212.88373: running the handler 11728 1726882212.88387: _low_level_execute_command(): starting 11728 1726882212.88397: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882212.88881: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882212.88885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882212.88888: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882212.88891: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882212.88950: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882212.88953: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882212.88999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882212.90640: stdout chunk (state=3): >>>/root <<< 11728 1726882212.90884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882212.90888: stdout chunk (state=3): >>><<< 11728 1726882212.90890: stderr chunk (state=3): >>><<< 11728 1726882212.90898: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882212.90901: _low_level_execute_command(): starting 11728 1726882212.90904: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882212.9080298-13633-240149832619515 `" && echo ansible-tmp-1726882212.9080298-13633-240149832619515="` echo /root/.ansible/tmp/ansible-tmp-1726882212.9080298-13633-240149832619515 `" ) && sleep 0' 11728 1726882212.91350: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882212.91354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882212.91365: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882212.91367: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882212.91370: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11728 1726882212.91387: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882212.91392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882212.91447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882212.91451: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882212.91508: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882212.93396: stdout chunk (state=3): >>>ansible-tmp-1726882212.9080298-13633-240149832619515=/root/.ansible/tmp/ansible-tmp-1726882212.9080298-13633-240149832619515 <<< 11728 1726882212.93527: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882212.93538: stderr chunk (state=3): >>><<< 11728 1726882212.93547: stdout chunk (state=3): >>><<< 11728 1726882212.93698: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882212.9080298-13633-240149832619515=/root/.ansible/tmp/ansible-tmp-1726882212.9080298-13633-240149832619515 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882212.93702: variable 'ansible_module_compression' from source: unknown 11728 1726882212.93705: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882212.93709: variable 'ansible_facts' from source: unknown 11728 1726882212.93789: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882212.9080298-13633-240149832619515/AnsiballZ_command.py 11728 1726882212.94058: Sending initial data 11728 1726882212.94061: Sent initial data (156 bytes) 11728 1726882212.94592: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882212.94620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882212.94661: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882212.94672: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882212.94726: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882212.96248: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882212.96324: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882212.96384: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmp3_o70dqc /root/.ansible/tmp/ansible-tmp-1726882212.9080298-13633-240149832619515/AnsiballZ_command.py <<< 11728 1726882212.96387: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882212.9080298-13633-240149832619515/AnsiballZ_command.py" <<< 11728 1726882212.96444: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmp3_o70dqc" to remote "/root/.ansible/tmp/ansible-tmp-1726882212.9080298-13633-240149832619515/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882212.9080298-13633-240149832619515/AnsiballZ_command.py" <<< 11728 1726882212.97172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882212.97205: stderr chunk (state=3): >>><<< 11728 1726882212.97209: stdout chunk (state=3): >>><<< 11728 1726882212.97255: done transferring module to remote 11728 1726882212.97265: _low_level_execute_command(): starting 11728 1726882212.97270: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882212.9080298-13633-240149832619515/ /root/.ansible/tmp/ansible-tmp-1726882212.9080298-13633-240149832619515/AnsiballZ_command.py && sleep 0' 11728 1726882212.97682: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882212.97685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882212.97689: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882212.97692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882212.97742: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882212.97745: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882212.97796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882212.99539: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882212.99558: stderr chunk (state=3): >>><<< 11728 1726882212.99561: stdout chunk (state=3): >>><<< 11728 1726882212.99573: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882212.99576: _low_level_execute_command(): starting 11728 1726882212.99581: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882212.9080298-13633-240149832619515/AnsiballZ_command.py && sleep 0' 11728 1726882212.99982: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882212.99985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882212.99987: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 11728 1726882212.99989: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882212.99991: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882213.00040: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882213.00043: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882213.00097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882213.15834: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-20 21:30:13.149162", "end": "2024-09-20 21:30:13.156561", "delta": "0:00:00.007399", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882213.17245: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.10.229 closed. 
<<< 11728 1726882213.17271: stderr chunk (state=3): >>><<< 11728 1726882213.17274: stdout chunk (state=3): >>><<< 11728 1726882213.17295: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-20 21:30:13.149162", "end": "2024-09-20 21:30:13.156561", "delta": "0:00:00.007399", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.10.229 closed. 
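The delete command itself exits with rc=1 ("Cannot find device \"nm-bond\""), which is expected in a cleanup path where the bond may already have been removed. The task result that follows reports failed_when_result false and changed false, so the non-zero return code is deliberately tolerated and the raw changed=true from the command module is suppressed. A cleanup task shaped like the sketch below would produce that behaviour; the command and the tolerated failure are confirmed by the logged module arguments and result, while the changed_when line is inferred from the reported changed: false.

    # Sketch of the cleanup task in cleanup_bond_profile+device.yml (reconstructed, not verbatim)
    - name: Delete the device '{{ controller_device }}'
      command: ip link del {{ controller_device }}
      failed_when: false
      changed_when: false
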
11728 1726882213.17330: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882212.9080298-13633-240149832619515/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882213.17337: _low_level_execute_command(): starting 11728 1726882213.17342: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882212.9080298-13633-240149832619515/ > /dev/null 2>&1 && sleep 0' 11728 1726882213.17768: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882213.17775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882213.17800: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882213.17803: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882213.17814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882213.17864: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882213.17868: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882213.17873: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882213.17917: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882213.19737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882213.19761: stderr chunk (state=3): >>><<< 11728 1726882213.19764: stdout chunk (state=3): >>><<< 11728 1726882213.19777: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882213.19783: handler run complete 11728 1726882213.19809: Evaluated conditional (False): False 11728 1726882213.19813: Evaluated conditional (False): False 11728 1726882213.19818: attempt loop complete, returning result 11728 1726882213.19821: _execute() done 11728 1726882213.19824: dumping result to json 11728 1726882213.19830: done dumping result, returning 11728 1726882213.19838: done running TaskExecutor() for managed_node3/TASK: Delete the device 'nm-bond' [12673a56-9f93-5c28-a762-0000000006d8] 11728 1726882213.19843: sending task result for task 12673a56-9f93-5c28-a762-0000000006d8 11728 1726882213.19939: done sending task result for task 12673a56-9f93-5c28-a762-0000000006d8 11728 1726882213.19942: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "link", "del", "nm-bond" ], "delta": "0:00:00.007399", "end": "2024-09-20 21:30:13.156561", "failed_when_result": false, "rc": 1, "start": "2024-09-20 21:30:13.149162" } STDERR: Cannot find device "nm-bond" MSG: non-zero return code 11728 1726882213.20007: no more pending results, returning what we have 11728 1726882213.20011: results queue empty 11728 1726882213.20012: checking for any_errors_fatal 11728 1726882213.20014: done checking for any_errors_fatal 11728 1726882213.20014: checking for max_fail_percentage 11728 1726882213.20016: done checking for max_fail_percentage 11728 1726882213.20017: checking to see if all hosts have failed and the running result is not ok 11728 1726882213.20017: done checking to see if all hosts have failed 11728 1726882213.20018: getting the remaining hosts for this loop 11728 1726882213.20020: done getting the remaining hosts for this loop 11728 1726882213.20023: getting the next task for host managed_node3 11728 1726882213.20033: done getting next task for host managed_node3 11728 1726882213.20036: ^ task is: TASK: Remove test interfaces 11728 1726882213.20040: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882213.20044: getting variables 11728 1726882213.20045: in VariableManager get_vars() 11728 1726882213.20084: Calling all_inventory to load vars for managed_node3 11728 1726882213.20086: Calling groups_inventory to load vars for managed_node3 11728 1726882213.20088: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882213.20101: Calling all_plugins_play to load vars for managed_node3 11728 1726882213.20104: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882213.20107: Calling groups_plugins_play to load vars for managed_node3 11728 1726882213.22804: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882213.26301: done with get_vars() 11728 1726882213.26330: done getting variables 11728 1726882213.26398: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 21:30:13 -0400 (0:00:00.399) 0:00:38.116 ****** 11728 1726882213.26434: entering _queue_task() for managed_node3/shell 11728 1726882213.26833: worker is 1 (out of 1 available) 11728 1726882213.26844: exiting _queue_task() for managed_node3/shell 11728 1726882213.26856: done queuing things up, now waiting for results queue to drain 11728 1726882213.26857: waiting for pending results... 
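Note on the result above: the module run for TASK: Delete the device 'nm-bond' returned rc=1 with 'Cannot find device "nm-bond"', yet the task is reported as ok with changed: false, because both of its conditionals evaluated to false (failed_when_result: false), apparently via failed_when/changed_when overrides in the task; the task file itself is not shown in this log. The exact command executed on managed_node3, taken from the module_args above, was:

    # rc=1 / "Cannot find device" is tolerated here: it simply means the bond was already gone
    ip link del nm-bond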
11728 1726882213.27144: running TaskExecutor() for managed_node3/TASK: Remove test interfaces 11728 1726882213.27237: in run() - task 12673a56-9f93-5c28-a762-0000000006de 11728 1726882213.27251: variable 'ansible_search_path' from source: unknown 11728 1726882213.27254: variable 'ansible_search_path' from source: unknown 11728 1726882213.27280: calling self._execute() 11728 1726882213.27362: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882213.27366: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882213.27376: variable 'omit' from source: magic vars 11728 1726882213.27650: variable 'ansible_distribution_major_version' from source: facts 11728 1726882213.27659: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882213.27664: variable 'omit' from source: magic vars 11728 1726882213.27704: variable 'omit' from source: magic vars 11728 1726882213.27810: variable 'dhcp_interface1' from source: play vars 11728 1726882213.27814: variable 'dhcp_interface2' from source: play vars 11728 1726882213.27831: variable 'omit' from source: magic vars 11728 1726882213.27861: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882213.27888: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882213.27907: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882213.27921: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882213.27931: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882213.27953: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882213.27957: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882213.27959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882213.28028: Set connection var ansible_connection to ssh 11728 1726882213.28036: Set connection var ansible_shell_executable to /bin/sh 11728 1726882213.28042: Set connection var ansible_timeout to 10 11728 1726882213.28044: Set connection var ansible_shell_type to sh 11728 1726882213.28050: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882213.28055: Set connection var ansible_pipelining to False 11728 1726882213.28073: variable 'ansible_shell_executable' from source: unknown 11728 1726882213.28076: variable 'ansible_connection' from source: unknown 11728 1726882213.28078: variable 'ansible_module_compression' from source: unknown 11728 1726882213.28080: variable 'ansible_shell_type' from source: unknown 11728 1726882213.28083: variable 'ansible_shell_executable' from source: unknown 11728 1726882213.28085: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882213.28089: variable 'ansible_pipelining' from source: unknown 11728 1726882213.28091: variable 'ansible_timeout' from source: unknown 11728 1726882213.28099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882213.28195: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882213.28205: variable 'omit' from source: magic vars 11728 1726882213.28211: starting attempt loop 11728 1726882213.28214: running the handler 11728 1726882213.28226: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882213.28240: _low_level_execute_command(): starting 11728 1726882213.28247: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882213.28732: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882213.28735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882213.28738: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882213.28741: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882213.28782: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882213.28798: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882213.28849: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882213.30516: stdout chunk (state=3): >>>/root <<< 11728 1726882213.30750: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882213.30754: stdout chunk (state=3): >>><<< 11728 1726882213.30756: stderr chunk (state=3): >>><<< 11728 1726882213.30760: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882213.30762: _low_level_execute_command(): starting 11728 1726882213.30765: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882213.3067036-13654-59740450129406 `" && echo ansible-tmp-1726882213.3067036-13654-59740450129406="` echo /root/.ansible/tmp/ansible-tmp-1726882213.3067036-13654-59740450129406 `" ) && sleep 0' 11728 1726882213.31309: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882213.31404: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882213.31407: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882213.31426: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882213.31518: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882213.33386: stdout chunk (state=3): >>>ansible-tmp-1726882213.3067036-13654-59740450129406=/root/.ansible/tmp/ansible-tmp-1726882213.3067036-13654-59740450129406 <<< 11728 1726882213.33604: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882213.33608: stdout chunk (state=3): >>><<< 11728 1726882213.33610: stderr chunk (state=3): >>><<< 11728 1726882213.33613: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882213.3067036-13654-59740450129406=/root/.ansible/tmp/ansible-tmp-1726882213.3067036-13654-59740450129406 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882213.33615: variable 'ansible_module_compression' from source: unknown 11728 1726882213.33710: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882213.33713: variable 'ansible_facts' from source: unknown 11728 1726882213.33796: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882213.3067036-13654-59740450129406/AnsiballZ_command.py 11728 1726882213.33971: Sending initial data 11728 1726882213.33974: Sent initial data (155 bytes) 11728 1726882213.34606: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882213.34617: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882213.34711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882213.34726: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882213.34748: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882213.34824: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882213.36361: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882213.36412: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882213.36473: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmp9wi7iis6 /root/.ansible/tmp/ansible-tmp-1726882213.3067036-13654-59740450129406/AnsiballZ_command.py <<< 11728 1726882213.36477: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882213.3067036-13654-59740450129406/AnsiballZ_command.py" <<< 11728 1726882213.36528: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmp9wi7iis6" to remote "/root/.ansible/tmp/ansible-tmp-1726882213.3067036-13654-59740450129406/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882213.3067036-13654-59740450129406/AnsiballZ_command.py" <<< 11728 1726882213.37341: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882213.37466: stderr chunk (state=3): >>><<< 11728 1726882213.37469: stdout chunk (state=3): >>><<< 11728 1726882213.37482: done transferring module to remote 11728 1726882213.37499: _low_level_execute_command(): starting 11728 1726882213.37510: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882213.3067036-13654-59740450129406/ /root/.ansible/tmp/ansible-tmp-1726882213.3067036-13654-59740450129406/AnsiballZ_command.py && sleep 0' 11728 1726882213.38124: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882213.38146: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882213.38159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882213.38171: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882213.38186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882213.38248: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882213.38266: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882213.38312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882213.40100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882213.40103: stdout chunk (state=3): >>><<< 11728 1726882213.40106: stderr chunk (state=3): >>><<< 11728 1726882213.40108: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 
originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882213.40111: _low_level_execute_command(): starting 11728 1726882213.40113: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882213.3067036-13654-59740450129406/AnsiballZ_command.py && sleep 0' 11728 1726882213.40684: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882213.40704: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882213.40710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882213.40779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882213.40864: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882213.40902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882213.60912: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 21:30:13.558975", "end": "2024-09-20 21:30:13.606135", "delta": "0:00:00.047160", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; 
then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882213.62564: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882213.62568: stdout chunk (state=3): >>><<< 11728 1726882213.62571: stderr chunk (state=3): >>><<< 11728 1726882213.62634: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 21:30:13.558975", "end": "2024-09-20 21:30:13.606135", "delta": "0:00:00.047160", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
11728 1726882213.62678: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882213.3067036-13654-59740450129406/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882213.62701: _low_level_execute_command(): starting 11728 1726882213.62705: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882213.3067036-13654-59740450129406/ > /dev/null 2>&1 && sleep 0' 11728 1726882213.63286: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882213.63289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882213.63291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882213.63298: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882213.63300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882213.63357: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882213.63361: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882213.63370: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882213.63431: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882213.65309: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882213.65313: stdout chunk (state=3): >>><<< 11728 1726882213.65403: stderr chunk (state=3): >>><<< 11728 1726882213.65407: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882213.65410: handler run complete 11728 1726882213.65412: Evaluated conditional (False): False 11728 1726882213.65414: attempt loop complete, returning result 11728 1726882213.65416: _execute() done 11728 1726882213.65418: dumping result to json 11728 1726882213.65420: done dumping result, returning 11728 1726882213.65422: done running TaskExecutor() for managed_node3/TASK: Remove test interfaces [12673a56-9f93-5c28-a762-0000000006de] 11728 1726882213.65424: sending task result for task 12673a56-9f93-5c28-a762-0000000006de 11728 1726882213.65616: done sending task result for task 12673a56-9f93-5c28-a762-0000000006de 11728 1726882213.65619: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.047160", "end": "2024-09-20 21:30:13.606135", "rc": 0, "start": "2024-09-20 21:30:13.558975" } STDERR: + exec + rc=0 + ip link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 11728 1726882213.65683: no more pending results, returning what we have 11728 1726882213.65686: results queue empty 11728 1726882213.65687: checking for any_errors_fatal 11728 1726882213.65699: done checking for any_errors_fatal 11728 1726882213.65700: checking for max_fail_percentage 11728 1726882213.65702: done checking for max_fail_percentage 11728 1726882213.65703: checking to see if all hosts have failed and the running result is not ok 11728 1726882213.65704: done checking to see if all hosts have failed 11728 1726882213.65704: getting the remaining hosts for this loop 11728 1726882213.65706: done getting the remaining hosts for this loop 11728 1726882213.65709: getting the next task for host managed_node3 11728 1726882213.65714: done getting next task for host managed_node3 11728 1726882213.65717: ^ task is: TASK: Stop dnsmasq/radvd services 11728 1726882213.65720: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=15, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882213.65724: getting variables 11728 1726882213.65725: in VariableManager get_vars() 11728 1726882213.65832: Calling all_inventory to load vars for managed_node3 11728 1726882213.65835: Calling groups_inventory to load vars for managed_node3 11728 1726882213.65841: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882213.65851: Calling all_plugins_play to load vars for managed_node3 11728 1726882213.65854: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882213.65857: Calling groups_plugins_play to load vars for managed_node3 11728 1726882213.66959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882213.68321: done with get_vars() 11728 1726882213.68351: done getting variables 11728 1726882213.68425: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Friday 20 September 2024 21:30:13 -0400 (0:00:00.420) 0:00:38.537 ****** 11728 1726882213.68462: entering _queue_task() for managed_node3/shell 11728 1726882213.69108: worker is 1 (out of 1 available) 11728 1726882213.69122: exiting _queue_task() for managed_node3/shell 11728 1726882213.69133: done queuing things up, now waiting for results queue to drain 11728 1726882213.69135: waiting for pending results... 
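For reference, the shell payload of TASK: Remove test interfaces (shown JSON-escaped in the module_args above) is reproduced below in readable form; on this run all three links were deleted without error (rc=0, delta 0:00:00.047160), so none of the ERROR branches fired.

    set -euxo pipefail
    exec 1>&2
    rc=0
    ip link delete test1 || rc="$?"
    if [ "$rc" != 0 ]; then
      echo ERROR - could not delete link test1 - error "$rc"
    fi
    ip link delete test2 || rc="$?"
    if [ "$rc" != 0 ]; then
      echo ERROR - could not delete link test2 - error "$rc"
    fi
    ip link delete testbr || rc="$?"
    if [ "$rc" != 0 ]; then
      echo ERROR - could not delete link testbr - error "$rc"
    fi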
11728 1726882213.69354: running TaskExecutor() for managed_node3/TASK: Stop dnsmasq/radvd services 11728 1726882213.69543: in run() - task 12673a56-9f93-5c28-a762-0000000006df 11728 1726882213.69549: variable 'ansible_search_path' from source: unknown 11728 1726882213.69552: variable 'ansible_search_path' from source: unknown 11728 1726882213.69614: calling self._execute() 11728 1726882213.69692: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882213.69743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882213.69749: variable 'omit' from source: magic vars 11728 1726882213.70184: variable 'ansible_distribution_major_version' from source: facts 11728 1726882213.70213: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882213.70283: variable 'omit' from source: magic vars 11728 1726882213.70291: variable 'omit' from source: magic vars 11728 1726882213.70325: variable 'omit' from source: magic vars 11728 1726882213.70366: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882213.70412: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882213.70445: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882213.70448: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882213.70481: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882213.70505: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882213.70509: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882213.70511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882213.70655: Set connection var ansible_connection to ssh 11728 1726882213.70658: Set connection var ansible_shell_executable to /bin/sh 11728 1726882213.70661: Set connection var ansible_timeout to 10 11728 1726882213.70663: Set connection var ansible_shell_type to sh 11728 1726882213.70665: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882213.70666: Set connection var ansible_pipelining to False 11728 1726882213.70702: variable 'ansible_shell_executable' from source: unknown 11728 1726882213.70705: variable 'ansible_connection' from source: unknown 11728 1726882213.70708: variable 'ansible_module_compression' from source: unknown 11728 1726882213.70710: variable 'ansible_shell_type' from source: unknown 11728 1726882213.70712: variable 'ansible_shell_executable' from source: unknown 11728 1726882213.70714: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882213.70716: variable 'ansible_pipelining' from source: unknown 11728 1726882213.70720: variable 'ansible_timeout' from source: unknown 11728 1726882213.70722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882213.70868: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882213.70877: variable 'omit' from source: magic vars 11728 
1726882213.70882: starting attempt loop 11728 1726882213.70888: running the handler 11728 1726882213.70902: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882213.70926: _low_level_execute_command(): starting 11728 1726882213.70932: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882213.71500: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882213.71504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882213.71508: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882213.71511: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882213.71576: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882213.71579: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882213.71583: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882213.71621: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882213.73229: stdout chunk (state=3): >>>/root <<< 11728 1726882213.73328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882213.73372: stderr chunk (state=3): >>><<< 11728 1726882213.73380: stdout chunk (state=3): >>><<< 11728 1726882213.73418: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882213.73482: _low_level_execute_command(): starting 11728 1726882213.73491: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882213.7342658-13678-100346306208520 `" && echo ansible-tmp-1726882213.7342658-13678-100346306208520="` echo /root/.ansible/tmp/ansible-tmp-1726882213.7342658-13678-100346306208520 `" ) && sleep 0' 11728 1726882213.74004: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882213.74008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882213.74019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11728 1726882213.74025: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882213.74048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882213.74090: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882213.74095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882213.74144: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882213.76023: stdout chunk (state=3): >>>ansible-tmp-1726882213.7342658-13678-100346306208520=/root/.ansible/tmp/ansible-tmp-1726882213.7342658-13678-100346306208520 <<< 11728 1726882213.76122: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882213.76155: stderr chunk (state=3): >>><<< 11728 1726882213.76158: stdout chunk (state=3): >>><<< 11728 1726882213.76177: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882213.7342658-13678-100346306208520=/root/.ansible/tmp/ansible-tmp-1726882213.7342658-13678-100346306208520 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882213.76208: variable 'ansible_module_compression' from source: unknown 11728 1726882213.76253: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882213.76283: variable 'ansible_facts' from source: unknown 11728 1726882213.76343: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882213.7342658-13678-100346306208520/AnsiballZ_command.py 11728 1726882213.76450: Sending initial data 11728 1726882213.76453: Sent initial data (156 bytes) 11728 1726882213.76916: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882213.76920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882213.76932: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882213.76984: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882213.76988: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882213.77041: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882213.78590: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882213.78636: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882213.78701: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpp6iefj8h /root/.ansible/tmp/ansible-tmp-1726882213.7342658-13678-100346306208520/AnsiballZ_command.py <<< 11728 1726882213.78708: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882213.7342658-13678-100346306208520/AnsiballZ_command.py" <<< 11728 1726882213.78733: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpp6iefj8h" to remote "/root/.ansible/tmp/ansible-tmp-1726882213.7342658-13678-100346306208520/AnsiballZ_command.py" <<< 11728 1726882213.78736: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882213.7342658-13678-100346306208520/AnsiballZ_command.py" <<< 11728 1726882213.79401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882213.79453: stderr chunk (state=3): >>><<< 11728 1726882213.79456: stdout chunk (state=3): >>><<< 11728 1726882213.79490: done transferring module to remote 11728 1726882213.79504: _low_level_execute_command(): starting 11728 1726882213.79507: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882213.7342658-13678-100346306208520/ /root/.ansible/tmp/ansible-tmp-1726882213.7342658-13678-100346306208520/AnsiballZ_command.py && sleep 0' 11728 1726882213.80104: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882213.80107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882213.80109: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882213.80111: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882213.80117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 11728 1726882213.80119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882213.80188: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882213.80192: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882213.80249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882213.81999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882213.82028: stderr chunk (state=3): >>><<< 11728 1726882213.82031: stdout chunk (state=3): >>><<< 11728 1726882213.82046: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882213.82049: _low_level_execute_command(): starting 11728 1726882213.82054: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882213.7342658-13678-100346306208520/AnsiballZ_command.py && sleep 0' 11728 1726882213.82586: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882213.82589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882213.82592: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 11728 1726882213.82596: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882213.82598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882213.82650: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882213.82653: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882213.82708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882214.00685: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd 
--query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 21:30:13.977463", "end": "2024-09-20 21:30:14.004100", "delta": "0:00:00.026637", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882214.02279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882214.02283: stdout chunk (state=3): >>><<< 11728 1726882214.02285: stderr chunk (state=3): >>><<< 11728 1726882214.02442: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 21:30:13.977463", "end": "2024-09-20 21:30:14.004100", "delta": "0:00:00.026637", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 11728 1726882214.02453: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882213.7342658-13678-100346306208520/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882214.02456: _low_level_execute_command(): starting 11728 1726882214.02459: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882213.7342658-13678-100346306208520/ > /dev/null 2>&1 && sleep 0' 11728 1726882214.03070: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882214.03083: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882214.03127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882214.03141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882214.03224: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882214.03257: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882214.03275: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882214.03302: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882214.03382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882214.05254: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882214.05258: stdout chunk (state=3): >>><<< 11728 1726882214.05263: stderr chunk (state=3): >>><<< 11728 1726882214.05307: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882214.05311: handler run complete 11728 1726882214.05334: Evaluated conditional (False): False 11728 1726882214.05341: attempt loop complete, returning result 11728 1726882214.05344: _execute() done 11728 1726882214.05346: dumping result to json 11728 1726882214.05352: done dumping result, returning 11728 1726882214.05359: done running TaskExecutor() for managed_node3/TASK: Stop dnsmasq/radvd services [12673a56-9f93-5c28-a762-0000000006df] 11728 1726882214.05364: sending task result for task 12673a56-9f93-5c28-a762-0000000006df 11728 1726882214.05472: done sending task result for task 12673a56-9f93-5c28-a762-0000000006df 11728 1726882214.05474: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.026637", "end": "2024-09-20 21:30:14.004100", "rc": 0, "start": "2024-09-20 21:30:13.977463" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active firewalld inactive 11728 1726882214.05567: no more pending results, returning what we have 11728 1726882214.05570: results queue empty 11728 1726882214.05571: checking for any_errors_fatal 11728 1726882214.05579: done checking for any_errors_fatal 11728 1726882214.05580: checking for max_fail_percentage 11728 1726882214.05582: done checking for max_fail_percentage 11728 1726882214.05583: checking to see 
if all hosts have failed and the running result is not ok 11728 1726882214.05583: done checking to see if all hosts have failed 11728 1726882214.05584: getting the remaining hosts for this loop 11728 1726882214.05586: done getting the remaining hosts for this loop 11728 1726882214.05589: getting the next task for host managed_node3 11728 1726882214.05603: done getting next task for host managed_node3 11728 1726882214.05607: ^ task is: TASK: Reset bond options to assert 11728 1726882214.05609: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882214.05612: getting variables 11728 1726882214.05614: in VariableManager get_vars() 11728 1726882214.05651: Calling all_inventory to load vars for managed_node3 11728 1726882214.05654: Calling groups_inventory to load vars for managed_node3 11728 1726882214.05656: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882214.05666: Calling all_plugins_play to load vars for managed_node3 11728 1726882214.05669: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882214.05671: Calling groups_plugins_play to load vars for managed_node3 11728 1726882214.10430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882214.11747: done with get_vars() 11728 1726882214.11770: done getting variables 11728 1726882214.11813: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Reset bond options to assert] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:59 Friday 20 September 2024 21:30:14 -0400 (0:00:00.433) 0:00:38.970 ****** 11728 1726882214.11832: entering _queue_task() for managed_node3/set_fact 11728 1726882214.12101: worker is 1 (out of 1 available) 11728 1726882214.12117: exiting _queue_task() for managed_node3/set_fact 11728 1726882214.12128: done queuing things up, now waiting for results queue to drain 11728 1726882214.12129: waiting for pending results... 
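[Editor's note] For readers reconstructing the test playbook from this trace: the AnsiballZ_command payload executed above corresponds to the "Stop dnsmasq/radvd services" task. The sketch below is an assumption-laden reconstruction, not the playbook source: it assumes the task is written with the shell module (the invocation shows ansible.legacy.command with _uses_shell: true) and that changed_when: false explains why the raw module result reports changed: true while the play output reports changed: false. The script body itself is copied verbatim from the _raw_params shown above.

- name: Stop dnsmasq/radvd services   # task name taken from the trace; module/keywords are assumptions
  ansible.builtin.shell: |
    set -uxo pipefail
    exec 1>&2
    pkill -F /run/dhcp_testbr.pid
    rm -rf /run/dhcp_testbr.pid
    rm -rf /run/dhcp_testbr.lease
    if grep 'release 6' /etc/redhat-release; then
      # Stop radvd server
      service radvd stop
      iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT
    fi
    if systemctl is-active firewalld; then
      for service in dhcp dhcpv6 dhcpv6-client; do
        if firewall-cmd --query-service="$service"; then
          firewall-cmd --remove-service "$service"
        fi
      done
    fi
  changed_when: false   # assumed: reconciles changed:true in the raw result with changed:false in the play output

Because the script runs exec 1>&2, the shell trace from set -x lands on stderr, which is why the result above has an empty stdout and the "+ ..." lines appear under STDERR.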
11728 1726882214.12315: running TaskExecutor() for managed_node3/TASK: Reset bond options to assert 11728 1726882214.12379: in run() - task 12673a56-9f93-5c28-a762-00000000000f 11728 1726882214.12390: variable 'ansible_search_path' from source: unknown 11728 1726882214.12423: calling self._execute() 11728 1726882214.12504: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.12509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.12517: variable 'omit' from source: magic vars 11728 1726882214.12802: variable 'ansible_distribution_major_version' from source: facts 11728 1726882214.12812: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882214.12819: variable 'omit' from source: magic vars 11728 1726882214.12841: variable 'omit' from source: magic vars 11728 1726882214.12867: variable 'dhcp_interface1' from source: play vars 11728 1726882214.12920: variable 'dhcp_interface1' from source: play vars 11728 1726882214.12935: variable 'omit' from source: magic vars 11728 1726882214.12967: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882214.12997: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882214.13014: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882214.13028: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882214.13039: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882214.13064: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882214.13067: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.13070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.13136: Set connection var ansible_connection to ssh 11728 1726882214.13144: Set connection var ansible_shell_executable to /bin/sh 11728 1726882214.13149: Set connection var ansible_timeout to 10 11728 1726882214.13152: Set connection var ansible_shell_type to sh 11728 1726882214.13159: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882214.13163: Set connection var ansible_pipelining to False 11728 1726882214.13181: variable 'ansible_shell_executable' from source: unknown 11728 1726882214.13185: variable 'ansible_connection' from source: unknown 11728 1726882214.13188: variable 'ansible_module_compression' from source: unknown 11728 1726882214.13190: variable 'ansible_shell_type' from source: unknown 11728 1726882214.13196: variable 'ansible_shell_executable' from source: unknown 11728 1726882214.13199: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.13202: variable 'ansible_pipelining' from source: unknown 11728 1726882214.13205: variable 'ansible_timeout' from source: unknown 11728 1726882214.13207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.13308: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 11728 1726882214.13318: variable 'omit' from source: magic vars 11728 1726882214.13324: starting attempt loop 11728 1726882214.13328: running the handler 11728 1726882214.13337: handler run complete 11728 1726882214.13351: attempt loop complete, returning result 11728 1726882214.13354: _execute() done 11728 1726882214.13356: dumping result to json 11728 1726882214.13359: done dumping result, returning 11728 1726882214.13365: done running TaskExecutor() for managed_node3/TASK: Reset bond options to assert [12673a56-9f93-5c28-a762-00000000000f] 11728 1726882214.13370: sending task result for task 12673a56-9f93-5c28-a762-00000000000f 11728 1726882214.13456: done sending task result for task 12673a56-9f93-5c28-a762-00000000000f 11728 1726882214.13459: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "bond_options_to_assert": [ { "key": "mode", "value": "active-backup" }, { "key": "arp_interval", "value": "60" }, { "key": "arp_ip_target", "value": "192.0.2.128" }, { "key": "arp_validate", "value": "none" }, { "key": "primary", "value": "test1" } ] }, "changed": false } 11728 1726882214.13534: no more pending results, returning what we have 11728 1726882214.13538: results queue empty 11728 1726882214.13539: checking for any_errors_fatal 11728 1726882214.13550: done checking for any_errors_fatal 11728 1726882214.13550: checking for max_fail_percentage 11728 1726882214.13552: done checking for max_fail_percentage 11728 1726882214.13553: checking to see if all hosts have failed and the running result is not ok 11728 1726882214.13554: done checking to see if all hosts have failed 11728 1726882214.13555: getting the remaining hosts for this loop 11728 1726882214.13557: done getting the remaining hosts for this loop 11728 1726882214.13560: getting the next task for host managed_node3 11728 1726882214.13569: done getting next task for host managed_node3 11728 1726882214.13572: ^ task is: TASK: Include the task 'run_test.yml' 11728 1726882214.13573: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882214.13577: getting variables 11728 1726882214.13578: in VariableManager get_vars() 11728 1726882214.13616: Calling all_inventory to load vars for managed_node3 11728 1726882214.13618: Calling groups_inventory to load vars for managed_node3 11728 1726882214.13621: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882214.13630: Calling all_plugins_play to load vars for managed_node3 11728 1726882214.13632: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882214.13634: Calling groups_plugins_play to load vars for managed_node3 11728 1726882214.14855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882214.16457: done with get_vars() 11728 1726882214.16479: done getting variables TASK [Include the task 'run_test.yml'] ***************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:72 Friday 20 September 2024 21:30:14 -0400 (0:00:00.047) 0:00:39.018 ****** 11728 1726882214.16602: entering _queue_task() for managed_node3/include_tasks 11728 1726882214.16935: worker is 1 (out of 1 available) 11728 1726882214.16950: exiting _queue_task() for managed_node3/include_tasks 11728 1726882214.16960: done queuing things up, now waiting for results queue to drain 11728 1726882214.16961: waiting for pending results... 11728 1726882214.17236: running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' 11728 1726882214.17322: in run() - task 12673a56-9f93-5c28-a762-000000000011 11728 1726882214.17345: variable 'ansible_search_path' from source: unknown 11728 1726882214.17409: calling self._execute() 11728 1726882214.17484: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.17490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.17518: variable 'omit' from source: magic vars 11728 1726882214.17872: variable 'ansible_distribution_major_version' from source: facts 11728 1726882214.17881: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882214.17888: _execute() done 11728 1726882214.17891: dumping result to json 11728 1726882214.17898: done dumping result, returning 11728 1726882214.17901: done running TaskExecutor() for managed_node3/TASK: Include the task 'run_test.yml' [12673a56-9f93-5c28-a762-000000000011] 11728 1726882214.17929: sending task result for task 12673a56-9f93-5c28-a762-000000000011 11728 1726882214.18066: done sending task result for task 12673a56-9f93-5c28-a762-000000000011 11728 1726882214.18069: WORKER PROCESS EXITING 11728 1726882214.18134: no more pending results, returning what we have 11728 1726882214.18139: in VariableManager get_vars() 11728 1726882214.18176: Calling all_inventory to load vars for managed_node3 11728 1726882214.18179: Calling groups_inventory to load vars for managed_node3 11728 1726882214.18181: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882214.18190: Calling all_plugins_play to load vars for managed_node3 11728 1726882214.18198: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882214.18201: Calling groups_plugins_play to load vars for managed_node3 11728 1726882214.19257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882214.20114: done with get_vars() 11728 1726882214.20126: variable 
'ansible_search_path' from source: unknown 11728 1726882214.20137: we have included files to process 11728 1726882214.20138: generating all_blocks data 11728 1726882214.20142: done generating all_blocks data 11728 1726882214.20146: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 11728 1726882214.20147: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 11728 1726882214.20149: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml 11728 1726882214.20422: in VariableManager get_vars() 11728 1726882214.20436: done with get_vars() 11728 1726882214.20462: in VariableManager get_vars() 11728 1726882214.20475: done with get_vars() 11728 1726882214.20504: in VariableManager get_vars() 11728 1726882214.20516: done with get_vars() 11728 1726882214.20541: in VariableManager get_vars() 11728 1726882214.20555: done with get_vars() 11728 1726882214.20581: in VariableManager get_vars() 11728 1726882214.20596: done with get_vars() 11728 1726882214.20848: in VariableManager get_vars() 11728 1726882214.20859: done with get_vars() 11728 1726882214.20867: done processing included file 11728 1726882214.20868: iterating over new_blocks loaded from include file 11728 1726882214.20869: in VariableManager get_vars() 11728 1726882214.20878: done with get_vars() 11728 1726882214.20879: filtering new block on tags 11728 1726882214.20941: done filtering new block on tags 11728 1726882214.20943: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml for managed_node3 11728 1726882214.20946: extending task lists for all hosts with included blocks 11728 1726882214.20967: done extending task lists 11728 1726882214.20968: done processing included files 11728 1726882214.20968: results queue empty 11728 1726882214.20969: checking for any_errors_fatal 11728 1726882214.20971: done checking for any_errors_fatal 11728 1726882214.20971: checking for max_fail_percentage 11728 1726882214.20972: done checking for max_fail_percentage 11728 1726882214.20972: checking to see if all hosts have failed and the running result is not ok 11728 1726882214.20973: done checking to see if all hosts have failed 11728 1726882214.20973: getting the remaining hosts for this loop 11728 1726882214.20974: done getting the remaining hosts for this loop 11728 1726882214.20975: getting the next task for host managed_node3 11728 1726882214.20978: done getting next task for host managed_node3 11728 1726882214.20979: ^ task is: TASK: TEST: {{ lsr_description }} 11728 1726882214.20981: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882214.20983: getting variables 11728 1726882214.20984: in VariableManager get_vars() 11728 1726882214.20990: Calling all_inventory to load vars for managed_node3 11728 1726882214.20992: Calling groups_inventory to load vars for managed_node3 11728 1726882214.20997: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882214.21001: Calling all_plugins_play to load vars for managed_node3 11728 1726882214.21002: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882214.21005: Calling groups_plugins_play to load vars for managed_node3 11728 1726882214.21904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882214.23290: done with get_vars() 11728 1726882214.23320: done getting variables 11728 1726882214.23366: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11728 1726882214.23509: variable 'lsr_description' from source: include params TASK [TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:5 Friday 20 September 2024 21:30:14 -0400 (0:00:00.069) 0:00:39.087 ****** 11728 1726882214.23541: entering _queue_task() for managed_node3/debug 11728 1726882214.23869: worker is 1 (out of 1 available) 11728 1726882214.23883: exiting _queue_task() for managed_node3/debug 11728 1726882214.23899: done queuing things up, now waiting for results queue to drain 11728 1726882214.23901: waiting for pending results... 11728 1726882214.24130: running TaskExecutor() for managed_node3/TASK: TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. 
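[Editor's note] The TEST banner being executed here and the "Show item" results that follow come from the prologue of run_test.yml (task paths run_test.yml:5 and run_test.yml:9 in the trace). A plausible sketch of those two tasks, inferred from the output rather than taken from the file, is shown below; the exact YAML in run_test.yml may differ, and the loop item list is deduced from the items echoed in the results (lsr_assert_when is reported as "VARIABLE IS NOT DEFINED!", consistent with debugging an undefined variable by name).

- name: "TEST: {{ lsr_description }}"
  ansible.builtin.debug:
    msg: |
      ##########
      {{ lsr_description }}
      ##########

- name: Show item
  ansible.builtin.debug:
    var: "{{ item }}"          # looks up the variable named by item
  loop:                        # item list inferred from the results below
    - lsr_description
    - lsr_setup
    - lsr_test
    - lsr_assert
    - lsr_assert_when
    - lsr_fail_debug

Using var rather than msg is why each loop result is keyed by the variable's own name (lsr_description, lsr_setup, ...) with its resolved value, instead of a fixed message field.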
11728 1726882214.24206: in run() - task 12673a56-9f93-5c28-a762-0000000008ea 11728 1726882214.24217: variable 'ansible_search_path' from source: unknown 11728 1726882214.24221: variable 'ansible_search_path' from source: unknown 11728 1726882214.24276: calling self._execute() 11728 1726882214.24334: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.24337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.24346: variable 'omit' from source: magic vars 11728 1726882214.24622: variable 'ansible_distribution_major_version' from source: facts 11728 1726882214.24632: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882214.24638: variable 'omit' from source: magic vars 11728 1726882214.24663: variable 'omit' from source: magic vars 11728 1726882214.24732: variable 'lsr_description' from source: include params 11728 1726882214.24746: variable 'omit' from source: magic vars 11728 1726882214.24780: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882214.24819: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882214.24836: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882214.24850: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882214.24860: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882214.24883: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882214.24886: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.24889: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.24960: Set connection var ansible_connection to ssh 11728 1726882214.24968: Set connection var ansible_shell_executable to /bin/sh 11728 1726882214.24974: Set connection var ansible_timeout to 10 11728 1726882214.24977: Set connection var ansible_shell_type to sh 11728 1726882214.24984: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882214.24988: Set connection var ansible_pipelining to False 11728 1726882214.25009: variable 'ansible_shell_executable' from source: unknown 11728 1726882214.25012: variable 'ansible_connection' from source: unknown 11728 1726882214.25015: variable 'ansible_module_compression' from source: unknown 11728 1726882214.25018: variable 'ansible_shell_type' from source: unknown 11728 1726882214.25020: variable 'ansible_shell_executable' from source: unknown 11728 1726882214.25024: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.25028: variable 'ansible_pipelining' from source: unknown 11728 1726882214.25031: variable 'ansible_timeout' from source: unknown 11728 1726882214.25033: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.25127: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882214.25136: variable 'omit' from source: magic vars 11728 1726882214.25141: 
starting attempt loop 11728 1726882214.25152: running the handler 11728 1726882214.25181: handler run complete 11728 1726882214.25191: attempt loop complete, returning result 11728 1726882214.25198: _execute() done 11728 1726882214.25201: dumping result to json 11728 1726882214.25203: done dumping result, returning 11728 1726882214.25208: done running TaskExecutor() for managed_node3/TASK: TEST: Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. [12673a56-9f93-5c28-a762-0000000008ea] 11728 1726882214.25213: sending task result for task 12673a56-9f93-5c28-a762-0000000008ea 11728 1726882214.25300: done sending task result for task 12673a56-9f93-5c28-a762-0000000008ea 11728 1726882214.25303: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: ########## Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device. ########## 11728 1726882214.25360: no more pending results, returning what we have 11728 1726882214.25364: results queue empty 11728 1726882214.25365: checking for any_errors_fatal 11728 1726882214.25366: done checking for any_errors_fatal 11728 1726882214.25367: checking for max_fail_percentage 11728 1726882214.25368: done checking for max_fail_percentage 11728 1726882214.25369: checking to see if all hosts have failed and the running result is not ok 11728 1726882214.25370: done checking to see if all hosts have failed 11728 1726882214.25370: getting the remaining hosts for this loop 11728 1726882214.25372: done getting the remaining hosts for this loop 11728 1726882214.25375: getting the next task for host managed_node3 11728 1726882214.25381: done getting next task for host managed_node3 11728 1726882214.25383: ^ task is: TASK: Show item 11728 1726882214.25386: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882214.25390: getting variables 11728 1726882214.25391: in VariableManager get_vars() 11728 1726882214.25433: Calling all_inventory to load vars for managed_node3 11728 1726882214.25436: Calling groups_inventory to load vars for managed_node3 11728 1726882214.25439: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882214.25448: Calling all_plugins_play to load vars for managed_node3 11728 1726882214.25450: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882214.25453: Calling groups_plugins_play to load vars for managed_node3 11728 1726882214.26813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882214.28610: done with get_vars() 11728 1726882214.28632: done getting variables 11728 1726882214.28702: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show item] *************************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:9 Friday 20 September 2024 21:30:14 -0400 (0:00:00.051) 0:00:39.139 ****** 11728 1726882214.28747: entering _queue_task() for managed_node3/debug 11728 1726882214.29140: worker is 1 (out of 1 available) 11728 1726882214.29156: exiting _queue_task() for managed_node3/debug 11728 1726882214.29167: done queuing things up, now waiting for results queue to drain 11728 1726882214.29169: waiting for pending results... 11728 1726882214.29607: running TaskExecutor() for managed_node3/TASK: Show item 11728 1726882214.29613: in run() - task 12673a56-9f93-5c28-a762-0000000008eb 11728 1726882214.29617: variable 'ansible_search_path' from source: unknown 11728 1726882214.29622: variable 'ansible_search_path' from source: unknown 11728 1726882214.29662: variable 'omit' from source: magic vars 11728 1726882214.29836: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.29853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.29872: variable 'omit' from source: magic vars 11728 1726882214.30462: variable 'ansible_distribution_major_version' from source: facts 11728 1726882214.30504: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882214.30507: variable 'omit' from source: magic vars 11728 1726882214.30541: variable 'omit' from source: magic vars 11728 1726882214.30629: variable 'item' from source: unknown 11728 1726882214.30903: variable 'item' from source: unknown 11728 1726882214.30907: variable 'omit' from source: magic vars 11728 1726882214.30912: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882214.30916: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882214.30944: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882214.30970: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882214.30990: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 11728 1726882214.31037: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882214.31050: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.31060: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.31216: Set connection var ansible_connection to ssh 11728 1726882214.31235: Set connection var ansible_shell_executable to /bin/sh 11728 1726882214.31256: Set connection var ansible_timeout to 10 11728 1726882214.31266: Set connection var ansible_shell_type to sh 11728 1726882214.31288: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882214.31369: Set connection var ansible_pipelining to False 11728 1726882214.31376: variable 'ansible_shell_executable' from source: unknown 11728 1726882214.31383: variable 'ansible_connection' from source: unknown 11728 1726882214.31388: variable 'ansible_module_compression' from source: unknown 11728 1726882214.31390: variable 'ansible_shell_type' from source: unknown 11728 1726882214.31396: variable 'ansible_shell_executable' from source: unknown 11728 1726882214.31405: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.31416: variable 'ansible_pipelining' from source: unknown 11728 1726882214.31424: variable 'ansible_timeout' from source: unknown 11728 1726882214.31433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.31700: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882214.31704: variable 'omit' from source: magic vars 11728 1726882214.31707: starting attempt loop 11728 1726882214.31713: running the handler 11728 1726882214.31716: variable 'lsr_description' from source: include params 11728 1726882214.31765: variable 'lsr_description' from source: include params 11728 1726882214.31782: handler run complete 11728 1726882214.31815: attempt loop complete, returning result 11728 1726882214.31832: variable 'item' from source: unknown 11728 1726882214.31896: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_description) => { "ansible_loop_var": "item", "item": "lsr_description", "lsr_description": "Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device." 
} 11728 1726882214.32401: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.32404: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.32406: variable 'omit' from source: magic vars 11728 1726882214.32408: variable 'ansible_distribution_major_version' from source: facts 11728 1726882214.32410: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882214.32412: variable 'omit' from source: magic vars 11728 1726882214.32414: variable 'omit' from source: magic vars 11728 1726882214.32415: variable 'item' from source: unknown 11728 1726882214.32461: variable 'item' from source: unknown 11728 1726882214.32478: variable 'omit' from source: magic vars 11728 1726882214.32503: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882214.32516: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882214.32534: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882214.32550: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882214.32557: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.32565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.32641: Set connection var ansible_connection to ssh 11728 1726882214.33025: Set connection var ansible_shell_executable to /bin/sh 11728 1726882214.33028: Set connection var ansible_timeout to 10 11728 1726882214.33031: Set connection var ansible_shell_type to sh 11728 1726882214.33033: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882214.33035: Set connection var ansible_pipelining to False 11728 1726882214.33036: variable 'ansible_shell_executable' from source: unknown 11728 1726882214.33038: variable 'ansible_connection' from source: unknown 11728 1726882214.33040: variable 'ansible_module_compression' from source: unknown 11728 1726882214.33041: variable 'ansible_shell_type' from source: unknown 11728 1726882214.33043: variable 'ansible_shell_executable' from source: unknown 11728 1726882214.33045: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.33046: variable 'ansible_pipelining' from source: unknown 11728 1726882214.33048: variable 'ansible_timeout' from source: unknown 11728 1726882214.33050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.33052: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882214.33134: variable 'omit' from source: magic vars 11728 1726882214.33138: starting attempt loop 11728 1726882214.33140: running the handler 11728 1726882214.33142: variable 'lsr_setup' from source: include params 11728 1726882214.33219: variable 'lsr_setup' from source: include params 11728 1726882214.33263: handler run complete 11728 1726882214.33276: attempt loop complete, returning result 11728 1726882214.33291: variable 'item' from source: unknown 11728 1726882214.33355: variable 'item' from 
source: unknown ok: [managed_node3] => (item=lsr_setup) => { "ansible_loop_var": "item", "item": "lsr_setup", "lsr_setup": [ "tasks/create_test_interfaces_with_dhcp.yml", "tasks/assert_dhcp_device_present.yml" ] } 11728 1726882214.33441: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.33444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.33447: variable 'omit' from source: magic vars 11728 1726882214.33699: variable 'ansible_distribution_major_version' from source: facts 11728 1726882214.33703: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882214.33705: variable 'omit' from source: magic vars 11728 1726882214.33708: variable 'omit' from source: magic vars 11728 1726882214.33710: variable 'item' from source: unknown 11728 1726882214.33712: variable 'item' from source: unknown 11728 1726882214.33728: variable 'omit' from source: magic vars 11728 1726882214.33753: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882214.33756: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882214.33758: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882214.33784: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882214.33789: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.33792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.33847: Set connection var ansible_connection to ssh 11728 1726882214.33855: Set connection var ansible_shell_executable to /bin/sh 11728 1726882214.33860: Set connection var ansible_timeout to 10 11728 1726882214.33862: Set connection var ansible_shell_type to sh 11728 1726882214.33868: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882214.33877: Set connection var ansible_pipelining to False 11728 1726882214.33921: variable 'ansible_shell_executable' from source: unknown 11728 1726882214.33924: variable 'ansible_connection' from source: unknown 11728 1726882214.33927: variable 'ansible_module_compression' from source: unknown 11728 1726882214.33929: variable 'ansible_shell_type' from source: unknown 11728 1726882214.33931: variable 'ansible_shell_executable' from source: unknown 11728 1726882214.33934: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.33936: variable 'ansible_pipelining' from source: unknown 11728 1726882214.33938: variable 'ansible_timeout' from source: unknown 11728 1726882214.33943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.34002: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882214.34013: variable 'omit' from source: magic vars 11728 1726882214.34016: starting attempt loop 11728 1726882214.34019: running the handler 11728 1726882214.34033: variable 'lsr_test' from source: include params 11728 1726882214.34082: variable 'lsr_test' from source: include params 11728 
1726882214.34100: handler run complete 11728 1726882214.34113: attempt loop complete, returning result 11728 1726882214.34124: variable 'item' from source: unknown 11728 1726882214.34164: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_test) => { "ansible_loop_var": "item", "item": "lsr_test", "lsr_test": [ "tasks/create_bond_profile_reconfigure.yml" ] } 11728 1726882214.34240: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.34243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.34245: variable 'omit' from source: magic vars 11728 1726882214.34344: variable 'ansible_distribution_major_version' from source: facts 11728 1726882214.34347: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882214.34352: variable 'omit' from source: magic vars 11728 1726882214.34366: variable 'omit' from source: magic vars 11728 1726882214.34391: variable 'item' from source: unknown 11728 1726882214.34437: variable 'item' from source: unknown 11728 1726882214.34447: variable 'omit' from source: magic vars 11728 1726882214.34461: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882214.34471: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882214.34474: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882214.34481: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882214.34484: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.34486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.34529: Set connection var ansible_connection to ssh 11728 1726882214.34536: Set connection var ansible_shell_executable to /bin/sh 11728 1726882214.34541: Set connection var ansible_timeout to 10 11728 1726882214.34543: Set connection var ansible_shell_type to sh 11728 1726882214.34549: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882214.34554: Set connection var ansible_pipelining to False 11728 1726882214.34568: variable 'ansible_shell_executable' from source: unknown 11728 1726882214.34570: variable 'ansible_connection' from source: unknown 11728 1726882214.34574: variable 'ansible_module_compression' from source: unknown 11728 1726882214.34577: variable 'ansible_shell_type' from source: unknown 11728 1726882214.34579: variable 'ansible_shell_executable' from source: unknown 11728 1726882214.34582: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.34585: variable 'ansible_pipelining' from source: unknown 11728 1726882214.34587: variable 'ansible_timeout' from source: unknown 11728 1726882214.34589: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.34647: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882214.34653: variable 'omit' from source: magic vars 11728 1726882214.34655: starting attempt loop 11728 1726882214.34658: running 
the handler 11728 1726882214.34672: variable 'lsr_assert' from source: include params 11728 1726882214.34719: variable 'lsr_assert' from source: include params 11728 1726882214.34730: handler run complete 11728 1726882214.34740: attempt loop complete, returning result 11728 1726882214.34750: variable 'item' from source: unknown 11728 1726882214.34791: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_assert) => { "ansible_loop_var": "item", "item": "lsr_assert", "lsr_assert": [ "tasks/assert_bond_options.yml" ] } 11728 1726882214.34863: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.34866: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.34875: variable 'omit' from source: magic vars 11728 1726882214.35003: variable 'ansible_distribution_major_version' from source: facts 11728 1726882214.35007: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882214.35011: variable 'omit' from source: magic vars 11728 1726882214.35021: variable 'omit' from source: magic vars 11728 1726882214.35046: variable 'item' from source: unknown 11728 1726882214.35099: variable 'item' from source: unknown 11728 1726882214.35102: variable 'omit' from source: magic vars 11728 1726882214.35116: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882214.35121: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882214.35127: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882214.35136: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882214.35138: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.35141: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.35181: Set connection var ansible_connection to ssh 11728 1726882214.35187: Set connection var ansible_shell_executable to /bin/sh 11728 1726882214.35192: Set connection var ansible_timeout to 10 11728 1726882214.35207: Set connection var ansible_shell_type to sh 11728 1726882214.35210: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882214.35212: Set connection var ansible_pipelining to False 11728 1726882214.35225: variable 'ansible_shell_executable' from source: unknown 11728 1726882214.35228: variable 'ansible_connection' from source: unknown 11728 1726882214.35230: variable 'ansible_module_compression' from source: unknown 11728 1726882214.35232: variable 'ansible_shell_type' from source: unknown 11728 1726882214.35234: variable 'ansible_shell_executable' from source: unknown 11728 1726882214.35237: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.35242: variable 'ansible_pipelining' from source: unknown 11728 1726882214.35244: variable 'ansible_timeout' from source: unknown 11728 1726882214.35248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.35302: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 11728 1726882214.35315: variable 'omit' from source: magic vars 11728 1726882214.35318: starting attempt loop 11728 1726882214.35320: running the handler 11728 1726882214.35383: handler run complete 11728 1726882214.35391: attempt loop complete, returning result 11728 1726882214.35406: variable 'item' from source: unknown 11728 1726882214.35449: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_assert_when) => { "ansible_loop_var": "item", "item": "lsr_assert_when", "lsr_assert_when": "VARIABLE IS NOT DEFINED!: 'lsr_assert_when' is undefined" } 11728 1726882214.35519: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.35533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.35536: variable 'omit' from source: magic vars 11728 1726882214.35623: variable 'ansible_distribution_major_version' from source: facts 11728 1726882214.35626: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882214.35636: variable 'omit' from source: magic vars 11728 1726882214.35642: variable 'omit' from source: magic vars 11728 1726882214.35667: variable 'item' from source: unknown 11728 1726882214.35710: variable 'item' from source: unknown 11728 1726882214.35720: variable 'omit' from source: magic vars 11728 1726882214.35735: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882214.35751: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882214.35754: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882214.35783: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882214.35786: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.35789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.35868: Set connection var ansible_connection to ssh 11728 1726882214.35872: Set connection var ansible_shell_executable to /bin/sh 11728 1726882214.35874: Set connection var ansible_timeout to 10 11728 1726882214.35876: Set connection var ansible_shell_type to sh 11728 1726882214.35878: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882214.35880: Set connection var ansible_pipelining to False 11728 1726882214.35882: variable 'ansible_shell_executable' from source: unknown 11728 1726882214.35884: variable 'ansible_connection' from source: unknown 11728 1726882214.35887: variable 'ansible_module_compression' from source: unknown 11728 1726882214.35889: variable 'ansible_shell_type' from source: unknown 11728 1726882214.35891: variable 'ansible_shell_executable' from source: unknown 11728 1726882214.35896: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.35899: variable 'ansible_pipelining' from source: unknown 11728 1726882214.35901: variable 'ansible_timeout' from source: unknown 11728 1726882214.36401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.36404: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882214.36407: variable 'omit' from source: magic vars 11728 1726882214.36409: starting attempt loop 11728 1726882214.36411: running the handler 11728 1726882214.36413: variable 'lsr_fail_debug' from source: play vars 11728 1726882214.36415: variable 'lsr_fail_debug' from source: play vars 11728 1726882214.36417: handler run complete 11728 1726882214.36419: attempt loop complete, returning result 11728 1726882214.36422: variable 'item' from source: unknown 11728 1726882214.36424: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_fail_debug) => { "ansible_loop_var": "item", "item": "lsr_fail_debug", "lsr_fail_debug": [ "__network_connections_result" ] } 11728 1726882214.36481: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.36484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.36487: variable 'omit' from source: magic vars 11728 1726882214.36489: variable 'ansible_distribution_major_version' from source: facts 11728 1726882214.36491: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882214.36497: variable 'omit' from source: magic vars 11728 1726882214.36500: variable 'omit' from source: magic vars 11728 1726882214.36502: variable 'item' from source: unknown 11728 1726882214.36504: variable 'item' from source: unknown 11728 1726882214.36506: variable 'omit' from source: magic vars 11728 1726882214.36513: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882214.36628: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882214.36637: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882214.36640: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882214.36642: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.36645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.36647: Set connection var ansible_connection to ssh 11728 1726882214.36649: Set connection var ansible_shell_executable to /bin/sh 11728 1726882214.36651: Set connection var ansible_timeout to 10 11728 1726882214.36653: Set connection var ansible_shell_type to sh 11728 1726882214.36655: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882214.36656: Set connection var ansible_pipelining to False 11728 1726882214.36658: variable 'ansible_shell_executable' from source: unknown 11728 1726882214.36660: variable 'ansible_connection' from source: unknown 11728 1726882214.36662: variable 'ansible_module_compression' from source: unknown 11728 1726882214.36664: variable 'ansible_shell_type' from source: unknown 11728 1726882214.36666: variable 'ansible_shell_executable' from source: unknown 11728 1726882214.36668: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.36670: variable 'ansible_pipelining' from source: unknown 11728 1726882214.36672: variable 'ansible_timeout' from source: unknown 11728 1726882214.36674: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 11728 1726882214.36734: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882214.36901: variable 'omit' from source: magic vars 11728 1726882214.36905: starting attempt loop 11728 1726882214.36907: running the handler 11728 1726882214.36909: variable 'lsr_cleanup' from source: include params 11728 1726882214.36911: variable 'lsr_cleanup' from source: include params 11728 1726882214.36913: handler run complete 11728 1726882214.36915: attempt loop complete, returning result 11728 1726882214.36917: variable 'item' from source: unknown 11728 1726882214.36919: variable 'item' from source: unknown ok: [managed_node3] => (item=lsr_cleanup) => { "ansible_loop_var": "item", "item": "lsr_cleanup", "lsr_cleanup": [ "tasks/cleanup_bond_profile+device.yml", "tasks/remove_test_interfaces_with_dhcp.yml", "tasks/check_network_dns.yml" ] } 11728 1726882214.36981: dumping result to json 11728 1726882214.36984: done dumping result, returning 11728 1726882214.36987: done running TaskExecutor() for managed_node3/TASK: Show item [12673a56-9f93-5c28-a762-0000000008eb] 11728 1726882214.36989: sending task result for task 12673a56-9f93-5c28-a762-0000000008eb 11728 1726882214.37252: done sending task result for task 12673a56-9f93-5c28-a762-0000000008eb 11728 1726882214.37256: WORKER PROCESS EXITING 11728 1726882214.37308: no more pending results, returning what we have 11728 1726882214.37311: results queue empty 11728 1726882214.37312: checking for any_errors_fatal 11728 1726882214.37320: done checking for any_errors_fatal 11728 1726882214.37320: checking for max_fail_percentage 11728 1726882214.37322: done checking for max_fail_percentage 11728 1726882214.37323: checking to see if all hosts have failed and the running result is not ok 11728 1726882214.37324: done checking to see if all hosts have failed 11728 1726882214.37324: getting the remaining hosts for this loop 11728 1726882214.37325: done getting the remaining hosts for this loop 11728 1726882214.37328: getting the next task for host managed_node3 11728 1726882214.37333: done getting next task for host managed_node3 11728 1726882214.37336: ^ task is: TASK: Include the task 'show_interfaces.yml' 11728 1726882214.37339: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882214.37342: getting variables 11728 1726882214.37343: in VariableManager get_vars() 11728 1726882214.37381: Calling all_inventory to load vars for managed_node3 11728 1726882214.37383: Calling groups_inventory to load vars for managed_node3 11728 1726882214.37386: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882214.37398: Calling all_plugins_play to load vars for managed_node3 11728 1726882214.37402: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882214.37405: Calling groups_plugins_play to load vars for managed_node3 11728 1726882214.39026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882214.40641: done with get_vars() 11728 1726882214.40663: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:21 Friday 20 September 2024 21:30:14 -0400 (0:00:00.120) 0:00:39.259 ****** 11728 1726882214.40753: entering _queue_task() for managed_node3/include_tasks 11728 1726882214.41057: worker is 1 (out of 1 available) 11728 1726882214.41067: exiting _queue_task() for managed_node3/include_tasks 11728 1726882214.41079: done queuing things up, now waiting for results queue to drain 11728 1726882214.41080: waiting for pending results... 11728 1726882214.41354: running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' 11728 1726882214.41500: in run() - task 12673a56-9f93-5c28-a762-0000000008ec 11728 1726882214.41504: variable 'ansible_search_path' from source: unknown 11728 1726882214.41507: variable 'ansible_search_path' from source: unknown 11728 1726882214.41536: calling self._execute() 11728 1726882214.41699: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.41702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.41706: variable 'omit' from source: magic vars 11728 1726882214.42048: variable 'ansible_distribution_major_version' from source: facts 11728 1726882214.42072: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882214.42082: _execute() done 11728 1726882214.42091: dumping result to json 11728 1726882214.42102: done dumping result, returning 11728 1726882214.42113: done running TaskExecutor() for managed_node3/TASK: Include the task 'show_interfaces.yml' [12673a56-9f93-5c28-a762-0000000008ec] 11728 1726882214.42123: sending task result for task 12673a56-9f93-5c28-a762-0000000008ec 11728 1726882214.42245: no more pending results, returning what we have 11728 1726882214.42251: in VariableManager get_vars() 11728 1726882214.42302: Calling all_inventory to load vars for managed_node3 11728 1726882214.42306: Calling groups_inventory to load vars for managed_node3 11728 1726882214.42309: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882214.42324: Calling all_plugins_play to load vars for managed_node3 11728 1726882214.42327: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882214.42331: Calling groups_plugins_play to load vars for managed_node3 11728 1726882214.43109: done sending task result for task 12673a56-9f93-5c28-a762-0000000008ec 11728 1726882214.43112: WORKER PROCESS EXITING 11728 1726882214.43853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 11728 1726882214.45366: done with get_vars() 11728 1726882214.45384: variable 'ansible_search_path' from source: unknown 11728 1726882214.45386: variable 'ansible_search_path' from source: unknown 11728 1726882214.45425: we have included files to process 11728 1726882214.45426: generating all_blocks data 11728 1726882214.45428: done generating all_blocks data 11728 1726882214.45432: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 11728 1726882214.45433: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 11728 1726882214.45435: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 11728 1726882214.45540: in VariableManager get_vars() 11728 1726882214.45561: done with get_vars() 11728 1726882214.45671: done processing included file 11728 1726882214.45673: iterating over new_blocks loaded from include file 11728 1726882214.45674: in VariableManager get_vars() 11728 1726882214.45691: done with get_vars() 11728 1726882214.45694: filtering new block on tags 11728 1726882214.45728: done filtering new block on tags 11728 1726882214.45731: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node3 11728 1726882214.45736: extending task lists for all hosts with included blocks 11728 1726882214.46148: done extending task lists 11728 1726882214.46149: done processing included files 11728 1726882214.46150: results queue empty 11728 1726882214.46151: checking for any_errors_fatal 11728 1726882214.46156: done checking for any_errors_fatal 11728 1726882214.46157: checking for max_fail_percentage 11728 1726882214.46159: done checking for max_fail_percentage 11728 1726882214.46159: checking to see if all hosts have failed and the running result is not ok 11728 1726882214.46160: done checking to see if all hosts have failed 11728 1726882214.46161: getting the remaining hosts for this loop 11728 1726882214.46162: done getting the remaining hosts for this loop 11728 1726882214.46164: getting the next task for host managed_node3 11728 1726882214.46168: done getting next task for host managed_node3 11728 1726882214.46170: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 11728 1726882214.46172: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882214.46175: getting variables 11728 1726882214.46176: in VariableManager get_vars() 11728 1726882214.46185: Calling all_inventory to load vars for managed_node3 11728 1726882214.46187: Calling groups_inventory to load vars for managed_node3 11728 1726882214.46189: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882214.46195: Calling all_plugins_play to load vars for managed_node3 11728 1726882214.46197: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882214.46200: Calling groups_plugins_play to load vars for managed_node3 11728 1726882214.47371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882214.48825: done with get_vars() 11728 1726882214.48845: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:30:14 -0400 (0:00:00.081) 0:00:39.341 ****** 11728 1726882214.48917: entering _queue_task() for managed_node3/include_tasks 11728 1726882214.49240: worker is 1 (out of 1 available) 11728 1726882214.49251: exiting _queue_task() for managed_node3/include_tasks 11728 1726882214.49265: done queuing things up, now waiting for results queue to drain 11728 1726882214.49266: waiting for pending results... 11728 1726882214.49477: running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' 11728 1726882214.49597: in run() - task 12673a56-9f93-5c28-a762-000000000913 11728 1726882214.49626: variable 'ansible_search_path' from source: unknown 11728 1726882214.49736: variable 'ansible_search_path' from source: unknown 11728 1726882214.49740: calling self._execute() 11728 1726882214.49770: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.49781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.49800: variable 'omit' from source: magic vars 11728 1726882214.50170: variable 'ansible_distribution_major_version' from source: facts 11728 1726882214.50188: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882214.50204: _execute() done 11728 1726882214.50212: dumping result to json 11728 1726882214.50219: done dumping result, returning 11728 1726882214.50230: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_current_interfaces.yml' [12673a56-9f93-5c28-a762-000000000913] 11728 1726882214.50241: sending task result for task 12673a56-9f93-5c28-a762-000000000913 11728 1726882214.50364: no more pending results, returning what we have 11728 1726882214.50370: in VariableManager get_vars() 11728 1726882214.50516: Calling all_inventory to load vars for managed_node3 11728 1726882214.50519: Calling groups_inventory to load vars for managed_node3 11728 1726882214.50521: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882214.50534: Calling all_plugins_play to load vars for managed_node3 11728 1726882214.50537: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882214.50541: Calling groups_plugins_play to load vars for managed_node3 11728 1726882214.51059: done sending task result for task 12673a56-9f93-5c28-a762-000000000913 11728 1726882214.51062: WORKER PROCESS EXITING 11728 1726882214.51999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 11728 1726882214.53755: done with get_vars() 11728 1726882214.53775: variable 'ansible_search_path' from source: unknown 11728 1726882214.53776: variable 'ansible_search_path' from source: unknown 11728 1726882214.53818: we have included files to process 11728 1726882214.53819: generating all_blocks data 11728 1726882214.53821: done generating all_blocks data 11728 1726882214.53822: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 11728 1726882214.53823: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 11728 1726882214.53826: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 11728 1726882214.54119: done processing included file 11728 1726882214.54122: iterating over new_blocks loaded from include file 11728 1726882214.54123: in VariableManager get_vars() 11728 1726882214.54142: done with get_vars() 11728 1726882214.54144: filtering new block on tags 11728 1726882214.54185: done filtering new block on tags 11728 1726882214.54188: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node3 11728 1726882214.54196: extending task lists for all hosts with included blocks 11728 1726882214.54346: done extending task lists 11728 1726882214.54348: done processing included files 11728 1726882214.54349: results queue empty 11728 1726882214.54349: checking for any_errors_fatal 11728 1726882214.54352: done checking for any_errors_fatal 11728 1726882214.54353: checking for max_fail_percentage 11728 1726882214.54354: done checking for max_fail_percentage 11728 1726882214.54355: checking to see if all hosts have failed and the running result is not ok 11728 1726882214.54356: done checking to see if all hosts have failed 11728 1726882214.54356: getting the remaining hosts for this loop 11728 1726882214.54358: done getting the remaining hosts for this loop 11728 1726882214.54360: getting the next task for host managed_node3 11728 1726882214.54364: done getting next task for host managed_node3 11728 1726882214.54366: ^ task is: TASK: Gather current interface info 11728 1726882214.54368: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11728 1726882214.54371: getting variables 11728 1726882214.54372: in VariableManager get_vars() 11728 1726882214.54382: Calling all_inventory to load vars for managed_node3 11728 1726882214.54384: Calling groups_inventory to load vars for managed_node3 11728 1726882214.54385: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882214.54390: Calling all_plugins_play to load vars for managed_node3 11728 1726882214.54400: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882214.54404: Calling groups_plugins_play to load vars for managed_node3 11728 1726882214.55579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882214.57171: done with get_vars() 11728 1726882214.57191: done getting variables 11728 1726882214.57242: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:30:14 -0400 (0:00:00.083) 0:00:39.425 ****** 11728 1726882214.57272: entering _queue_task() for managed_node3/command 11728 1726882214.57633: worker is 1 (out of 1 available) 11728 1726882214.57645: exiting _queue_task() for managed_node3/command 11728 1726882214.57658: done queuing things up, now waiting for results queue to drain 11728 1726882214.57659: waiting for pending results... 
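The task file behind the TASK [Gather current interface info] banner above (get_current_interfaces.yml:3) is not reproduced in this log. Based on the module arguments recorded in the result further below ("_raw_params": "ls -1", "chdir": "/sys/class/net") and the later lookup of '_current_interfaces', it plausibly looks like the following sketch; the module spelling, option layout, and register target are inferred rather than quoted from the test file:

- name: Gather current interface info
  command:
    cmd: ls -1
    chdir: /sys/class/net
  register: _current_interfaces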
11728 1726882214.58011: running TaskExecutor() for managed_node3/TASK: Gather current interface info 11728 1726882214.58016: in run() - task 12673a56-9f93-5c28-a762-00000000094e 11728 1726882214.58019: variable 'ansible_search_path' from source: unknown 11728 1726882214.58032: variable 'ansible_search_path' from source: unknown 11728 1726882214.58074: calling self._execute() 11728 1726882214.58177: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.58189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.58208: variable 'omit' from source: magic vars 11728 1726882214.58591: variable 'ansible_distribution_major_version' from source: facts 11728 1726882214.58612: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882214.58624: variable 'omit' from source: magic vars 11728 1726882214.58679: variable 'omit' from source: magic vars 11728 1726882214.58717: variable 'omit' from source: magic vars 11728 1726882214.58758: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882214.58914: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882214.58987: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882214.58991: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882214.58997: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882214.59019: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882214.59027: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.59034: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.59140: Set connection var ansible_connection to ssh 11728 1726882214.59155: Set connection var ansible_shell_executable to /bin/sh 11728 1726882214.59164: Set connection var ansible_timeout to 10 11728 1726882214.59170: Set connection var ansible_shell_type to sh 11728 1726882214.59181: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882214.59202: Set connection var ansible_pipelining to False 11728 1726882214.59229: variable 'ansible_shell_executable' from source: unknown 11728 1726882214.59300: variable 'ansible_connection' from source: unknown 11728 1726882214.59305: variable 'ansible_module_compression' from source: unknown 11728 1726882214.59308: variable 'ansible_shell_type' from source: unknown 11728 1726882214.59311: variable 'ansible_shell_executable' from source: unknown 11728 1726882214.59313: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.59315: variable 'ansible_pipelining' from source: unknown 11728 1726882214.59317: variable 'ansible_timeout' from source: unknown 11728 1726882214.59320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.59501: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882214.59505: variable 'omit' from source: magic vars 11728 
1726882214.59507: starting attempt loop 11728 1726882214.59510: running the handler 11728 1726882214.59512: _low_level_execute_command(): starting 11728 1726882214.59514: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882214.60319: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882214.60369: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882214.60389: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882214.60421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882214.60523: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882214.62201: stdout chunk (state=3): >>>/root <<< 11728 1726882214.62413: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882214.62417: stdout chunk (state=3): >>><<< 11728 1726882214.62419: stderr chunk (state=3): >>><<< 11728 1726882214.62439: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882214.62458: _low_level_execute_command(): starting 11728 1726882214.62535: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882214.62446-13715-82286172900772 `" && echo ansible-tmp-1726882214.62446-13715-82286172900772="` echo 
/root/.ansible/tmp/ansible-tmp-1726882214.62446-13715-82286172900772 `" ) && sleep 0' 11728 1726882214.63043: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882214.63057: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882214.63071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882214.63088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882214.63109: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882214.63205: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882214.63232: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882214.63248: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882214.63325: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882214.65234: stdout chunk (state=3): >>>ansible-tmp-1726882214.62446-13715-82286172900772=/root/.ansible/tmp/ansible-tmp-1726882214.62446-13715-82286172900772 <<< 11728 1726882214.65338: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882214.65397: stderr chunk (state=3): >>><<< 11728 1726882214.65421: stdout chunk (state=3): >>><<< 11728 1726882214.65601: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882214.62446-13715-82286172900772=/root/.ansible/tmp/ansible-tmp-1726882214.62446-13715-82286172900772 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882214.65605: variable 'ansible_module_compression' from source: unknown 11728 
1726882214.65608: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882214.65610: variable 'ansible_facts' from source: unknown 11728 1726882214.65665: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882214.62446-13715-82286172900772/AnsiballZ_command.py 11728 1726882214.65856: Sending initial data 11728 1726882214.65866: Sent initial data (153 bytes) 11728 1726882214.66435: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882214.66444: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882214.66456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882214.66508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882214.66555: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882214.66566: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882214.66574: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882214.66656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882214.68249: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882214.68322: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882214.68369: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmp9jf1qify /root/.ansible/tmp/ansible-tmp-1726882214.62446-13715-82286172900772/AnsiballZ_command.py <<< 11728 1726882214.68372: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882214.62446-13715-82286172900772/AnsiballZ_command.py" <<< 11728 1726882214.68413: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmp9jf1qify" to remote "/root/.ansible/tmp/ansible-tmp-1726882214.62446-13715-82286172900772/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882214.62446-13715-82286172900772/AnsiballZ_command.py" <<< 11728 1726882214.69151: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882214.69202: stderr chunk (state=3): >>><<< 11728 1726882214.69268: stdout chunk (state=3): >>><<< 11728 1726882214.69278: done transferring module to remote 11728 1726882214.69298: _low_level_execute_command(): starting 11728 1726882214.69310: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882214.62446-13715-82286172900772/ /root/.ansible/tmp/ansible-tmp-1726882214.62446-13715-82286172900772/AnsiballZ_command.py && sleep 0' 11728 1726882214.69926: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882214.69942: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882214.69958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882214.69975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882214.69995: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882214.70009: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882214.70109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882214.70128: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882214.70146: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882214.70221: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882214.72138: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882214.72147: stdout chunk (state=3): >>><<< 11728 1726882214.72157: stderr chunk (state=3): >>><<< 11728 1726882214.72176: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882214.72183: _low_level_execute_command(): starting 11728 1726882214.72198: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882214.62446-13715-82286172900772/AnsiballZ_command.py && sleep 0' 11728 1726882214.72780: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882214.72798: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882214.72811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882214.72830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882214.72846: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882214.72945: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882214.72956: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882214.72972: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882214.73048: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882214.88323: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:30:14.878135", "end": "2024-09-20 21:30:14.881419", "delta": "0:00:00.003284", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882214.89957: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 11728 1726882214.89979: stdout chunk (state=3): >>><<< 11728 1726882214.89991: stderr chunk (state=3): >>><<< 11728 1726882214.90022: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:30:14.878135", "end": "2024-09-20 21:30:14.881419", "delta": "0:00:00.003284", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
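The JSON above is the raw return of the command module. Once registered, later tasks see it roughly in the shape sketched here; stdout_lines is the line-split view of stdout that the documented command return values include, and the exact set of keys depends on the ansible-core version, so treat this as an approximation rather than a dump of the real variable:

_current_interfaces:
  rc: 0
  cmd: [ls, '-1']
  stdout: |-
    bonding_masters
    eth0
    lo
  stdout_lines:
    - bonding_masters
    - eth0
    - lo

Note that the module itself reports "changed": true while the task result printed below shows "changed": false, which suggests the task applies something like changed_when: false; that directive is not visible in this log.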
11728 1726882214.90065: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882214.62446-13715-82286172900772/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882214.90088: _low_level_execute_command(): starting 11728 1726882214.90103: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882214.62446-13715-82286172900772/ > /dev/null 2>&1 && sleep 0' 11728 1726882214.90927: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882214.90930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882214.90932: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882214.90934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882214.90981: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882214.91012: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882214.91070: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882214.92962: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882214.92966: stdout chunk (state=3): >>><<< 11728 1726882214.92968: stderr chunk (state=3): >>><<< 11728 1726882214.92982: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882214.92992: handler run complete 11728 1726882214.93200: Evaluated conditional (False): False 11728 1726882214.93203: attempt loop complete, returning result 11728 1726882214.93205: _execute() done 11728 1726882214.93207: dumping result to json 11728 1726882214.93209: done dumping result, returning 11728 1726882214.93210: done running TaskExecutor() for managed_node3/TASK: Gather current interface info [12673a56-9f93-5c28-a762-00000000094e] 11728 1726882214.93212: sending task result for task 12673a56-9f93-5c28-a762-00000000094e 11728 1726882214.93278: done sending task result for task 12673a56-9f93-5c28-a762-00000000094e 11728 1726882214.93280: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003284", "end": "2024-09-20 21:30:14.881419", "rc": 0, "start": "2024-09-20 21:30:14.878135" } STDOUT: bonding_masters eth0 lo 11728 1726882214.93362: no more pending results, returning what we have 11728 1726882214.93366: results queue empty 11728 1726882214.93367: checking for any_errors_fatal 11728 1726882214.93369: done checking for any_errors_fatal 11728 1726882214.93370: checking for max_fail_percentage 11728 1726882214.93372: done checking for max_fail_percentage 11728 1726882214.93373: checking to see if all hosts have failed and the running result is not ok 11728 1726882214.93374: done checking to see if all hosts have failed 11728 1726882214.93375: getting the remaining hosts for this loop 11728 1726882214.93377: done getting the remaining hosts for this loop 11728 1726882214.93381: getting the next task for host managed_node3 11728 1726882214.93398: done getting next task for host managed_node3 11728 1726882214.93402: ^ task is: TASK: Set current_interfaces 11728 1726882214.93408: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882214.93414: getting variables 11728 1726882214.93415: in VariableManager get_vars() 11728 1726882214.93457: Calling all_inventory to load vars for managed_node3 11728 1726882214.93460: Calling groups_inventory to load vars for managed_node3 11728 1726882214.93462: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882214.93473: Calling all_plugins_play to load vars for managed_node3 11728 1726882214.93476: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882214.93479: Calling groups_plugins_play to load vars for managed_node3 11728 1726882214.95355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882214.96888: done with get_vars() 11728 1726882214.96913: done getting variables 11728 1726882214.96978: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:30:14 -0400 (0:00:00.397) 0:00:39.822 ****** 11728 1726882214.97014: entering _queue_task() for managed_node3/set_fact 11728 1726882214.97338: worker is 1 (out of 1 available) 11728 1726882214.97349: exiting _queue_task() for managed_node3/set_fact 11728 1726882214.97363: done queuing things up, now waiting for results queue to drain 11728 1726882214.97364: waiting for pending results... 
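The Set current_interfaces task at get_current_interfaces.yml:9 is likewise not shown verbatim. Given that it reads '_current_interfaces' and produces the list fact seen in the result below, a minimal sketch is the following, where the use of stdout_lines is an assumption that merely matches the observed output:

- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"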
11728 1726882214.97719: running TaskExecutor() for managed_node3/TASK: Set current_interfaces 11728 1726882214.97733: in run() - task 12673a56-9f93-5c28-a762-00000000094f 11728 1726882214.97752: variable 'ansible_search_path' from source: unknown 11728 1726882214.97758: variable 'ansible_search_path' from source: unknown 11728 1726882214.97792: calling self._execute() 11728 1726882214.97903: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.97919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.97932: variable 'omit' from source: magic vars 11728 1726882214.98356: variable 'ansible_distribution_major_version' from source: facts 11728 1726882214.98360: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882214.98363: variable 'omit' from source: magic vars 11728 1726882214.98392: variable 'omit' from source: magic vars 11728 1726882214.98510: variable '_current_interfaces' from source: set_fact 11728 1726882214.98589: variable 'omit' from source: magic vars 11728 1726882214.98638: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882214.98687: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882214.98797: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882214.98801: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882214.98804: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882214.98806: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882214.98808: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.98815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.98921: Set connection var ansible_connection to ssh 11728 1726882214.98937: Set connection var ansible_shell_executable to /bin/sh 11728 1726882214.98948: Set connection var ansible_timeout to 10 11728 1726882214.98955: Set connection var ansible_shell_type to sh 11728 1726882214.98966: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882214.98975: Set connection var ansible_pipelining to False 11728 1726882214.99011: variable 'ansible_shell_executable' from source: unknown 11728 1726882214.99021: variable 'ansible_connection' from source: unknown 11728 1726882214.99029: variable 'ansible_module_compression' from source: unknown 11728 1726882214.99036: variable 'ansible_shell_type' from source: unknown 11728 1726882214.99122: variable 'ansible_shell_executable' from source: unknown 11728 1726882214.99126: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882214.99128: variable 'ansible_pipelining' from source: unknown 11728 1726882214.99130: variable 'ansible_timeout' from source: unknown 11728 1726882214.99132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882214.99229: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 11728 1726882214.99248: variable 'omit' from source: magic vars 11728 1726882214.99259: starting attempt loop 11728 1726882214.99266: running the handler 11728 1726882214.99282: handler run complete 11728 1726882214.99300: attempt loop complete, returning result 11728 1726882214.99307: _execute() done 11728 1726882214.99315: dumping result to json 11728 1726882214.99323: done dumping result, returning 11728 1726882214.99348: done running TaskExecutor() for managed_node3/TASK: Set current_interfaces [12673a56-9f93-5c28-a762-00000000094f] 11728 1726882214.99359: sending task result for task 12673a56-9f93-5c28-a762-00000000094f 11728 1726882214.99610: done sending task result for task 12673a56-9f93-5c28-a762-00000000094f 11728 1726882214.99614: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 11728 1726882214.99671: no more pending results, returning what we have 11728 1726882214.99675: results queue empty 11728 1726882214.99676: checking for any_errors_fatal 11728 1726882214.99686: done checking for any_errors_fatal 11728 1726882214.99687: checking for max_fail_percentage 11728 1726882214.99688: done checking for max_fail_percentage 11728 1726882214.99689: checking to see if all hosts have failed and the running result is not ok 11728 1726882214.99690: done checking to see if all hosts have failed 11728 1726882214.99690: getting the remaining hosts for this loop 11728 1726882214.99696: done getting the remaining hosts for this loop 11728 1726882214.99699: getting the next task for host managed_node3 11728 1726882214.99900: done getting next task for host managed_node3 11728 1726882214.99903: ^ task is: TASK: Show current_interfaces 11728 1726882214.99908: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882214.99913: getting variables 11728 1726882214.99914: in VariableManager get_vars() 11728 1726882214.99956: Calling all_inventory to load vars for managed_node3 11728 1726882214.99959: Calling groups_inventory to load vars for managed_node3 11728 1726882214.99961: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882214.99971: Calling all_plugins_play to load vars for managed_node3 11728 1726882214.99974: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882214.99976: Calling groups_plugins_play to load vars for managed_node3 11728 1726882215.01364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882215.02902: done with get_vars() 11728 1726882215.02930: done getting variables 11728 1726882215.02990: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:30:15 -0400 (0:00:00.060) 0:00:39.882 ****** 11728 1726882215.03026: entering _queue_task() for managed_node3/debug 11728 1726882215.03361: worker is 1 (out of 1 available) 11728 1726882215.03374: exiting _queue_task() for managed_node3/debug 11728 1726882215.03385: done queuing things up, now waiting for results queue to drain 11728 1726882215.03386: waiting for pending results... 11728 1726882215.03655: running TaskExecutor() for managed_node3/TASK: Show current_interfaces 11728 1726882215.03769: in run() - task 12673a56-9f93-5c28-a762-000000000914 11728 1726882215.03786: variable 'ansible_search_path' from source: unknown 11728 1726882215.03792: variable 'ansible_search_path' from source: unknown 11728 1726882215.04001: calling self._execute() 11728 1726882215.04004: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882215.04008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882215.04011: variable 'omit' from source: magic vars 11728 1726882215.04333: variable 'ansible_distribution_major_version' from source: facts 11728 1726882215.04356: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882215.04369: variable 'omit' from source: magic vars 11728 1726882215.04420: variable 'omit' from source: magic vars 11728 1726882215.04522: variable 'current_interfaces' from source: set_fact 11728 1726882215.04559: variable 'omit' from source: magic vars 11728 1726882215.04606: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882215.04675: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882215.04678: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882215.04696: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882215.04714: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882215.04750: 
variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882215.04783: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882215.04787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882215.04868: Set connection var ansible_connection to ssh 11728 1726882215.04883: Set connection var ansible_shell_executable to /bin/sh 11728 1726882215.04900: Set connection var ansible_timeout to 10 11728 1726882215.05001: Set connection var ansible_shell_type to sh 11728 1726882215.05004: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882215.05006: Set connection var ansible_pipelining to False 11728 1726882215.05008: variable 'ansible_shell_executable' from source: unknown 11728 1726882215.05011: variable 'ansible_connection' from source: unknown 11728 1726882215.05013: variable 'ansible_module_compression' from source: unknown 11728 1726882215.05015: variable 'ansible_shell_type' from source: unknown 11728 1726882215.05017: variable 'ansible_shell_executable' from source: unknown 11728 1726882215.05019: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882215.05021: variable 'ansible_pipelining' from source: unknown 11728 1726882215.05023: variable 'ansible_timeout' from source: unknown 11728 1726882215.05025: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882215.05120: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882215.05135: variable 'omit' from source: magic vars 11728 1726882215.05143: starting attempt loop 11728 1726882215.05148: running the handler 11728 1726882215.05190: handler run complete 11728 1726882215.05216: attempt loop complete, returning result 11728 1726882215.05225: _execute() done 11728 1726882215.05233: dumping result to json 11728 1726882215.05240: done dumping result, returning 11728 1726882215.05250: done running TaskExecutor() for managed_node3/TASK: Show current_interfaces [12673a56-9f93-5c28-a762-000000000914] 11728 1726882215.05260: sending task result for task 12673a56-9f93-5c28-a762-000000000914 11728 1726882215.05551: done sending task result for task 12673a56-9f93-5c28-a762-000000000914 11728 1726882215.05554: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 11728 1726882215.05602: no more pending results, returning what we have 11728 1726882215.05606: results queue empty 11728 1726882215.05607: checking for any_errors_fatal 11728 1726882215.05611: done checking for any_errors_fatal 11728 1726882215.05612: checking for max_fail_percentage 11728 1726882215.05614: done checking for max_fail_percentage 11728 1726882215.05615: checking to see if all hosts have failed and the running result is not ok 11728 1726882215.05615: done checking to see if all hosts have failed 11728 1726882215.05616: getting the remaining hosts for this loop 11728 1726882215.05618: done getting the remaining hosts for this loop 11728 1726882215.05622: getting the next task for host managed_node3 11728 1726882215.05630: done getting next task for host managed_node3 11728 1726882215.05633: ^ task is: TASK: Setup 11728 1726882215.05636: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882215.05641: getting variables 11728 1726882215.05642: in VariableManager get_vars() 11728 1726882215.05678: Calling all_inventory to load vars for managed_node3 11728 1726882215.05681: Calling groups_inventory to load vars for managed_node3 11728 1726882215.05684: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882215.05696: Calling all_plugins_play to load vars for managed_node3 11728 1726882215.05700: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882215.05703: Calling groups_plugins_play to load vars for managed_node3 11728 1726882215.07431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882215.09247: done with get_vars() 11728 1726882215.09280: done getting variables TASK [Setup] ******************************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:24 Friday 20 September 2024 21:30:15 -0400 (0:00:00.063) 0:00:39.946 ****** 11728 1726882215.09386: entering _queue_task() for managed_node3/include_tasks 11728 1726882215.09747: worker is 1 (out of 1 available) 11728 1726882215.09758: exiting _queue_task() for managed_node3/include_tasks 11728 1726882215.09883: done queuing things up, now waiting for results queue to drain 11728 1726882215.09885: waiting for pending results... 
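[Editor's note] The "Show current_interfaces" task above (tasks/show_interfaces.yml:5) prints MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo']. A minimal debug task that would produce that exact message, sketched here as an illustration since the task file itself is not shown in the log:

# Hedged sketch of the "Show current_interfaces" step in show_interfaces.yml.
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"

A debug task never changes the host, which is why the trace records it as ok with "changed" omitted from the result body.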
11728 1726882215.10059: running TaskExecutor() for managed_node3/TASK: Setup 11728 1726882215.10164: in run() - task 12673a56-9f93-5c28-a762-0000000008ed 11728 1726882215.10182: variable 'ansible_search_path' from source: unknown 11728 1726882215.10188: variable 'ansible_search_path' from source: unknown 11728 1726882215.10247: variable 'lsr_setup' from source: include params 11728 1726882215.10469: variable 'lsr_setup' from source: include params 11728 1726882215.10599: variable 'omit' from source: magic vars 11728 1726882215.10699: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882215.10717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882215.10733: variable 'omit' from source: magic vars 11728 1726882215.11038: variable 'ansible_distribution_major_version' from source: facts 11728 1726882215.11053: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882215.11064: variable 'item' from source: unknown 11728 1726882215.11612: variable 'item' from source: unknown 11728 1726882215.11614: variable 'item' from source: unknown 11728 1726882215.11616: variable 'item' from source: unknown 11728 1726882215.12151: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882215.12155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882215.12158: variable 'omit' from source: magic vars 11728 1726882215.12608: variable 'ansible_distribution_major_version' from source: facts 11728 1726882215.12611: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882215.12613: variable 'item' from source: unknown 11728 1726882215.12616: variable 'item' from source: unknown 11728 1726882215.12743: variable 'item' from source: unknown 11728 1726882215.12968: variable 'item' from source: unknown 11728 1726882215.13163: dumping result to json 11728 1726882215.13166: done dumping result, returning 11728 1726882215.13169: done running TaskExecutor() for managed_node3/TASK: Setup [12673a56-9f93-5c28-a762-0000000008ed] 11728 1726882215.13171: sending task result for task 12673a56-9f93-5c28-a762-0000000008ed 11728 1726882215.13216: done sending task result for task 12673a56-9f93-5c28-a762-0000000008ed 11728 1726882215.13219: WORKER PROCESS EXITING 11728 1726882215.13295: no more pending results, returning what we have 11728 1726882215.13302: in VariableManager get_vars() 11728 1726882215.13352: Calling all_inventory to load vars for managed_node3 11728 1726882215.13355: Calling groups_inventory to load vars for managed_node3 11728 1726882215.13358: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882215.13373: Calling all_plugins_play to load vars for managed_node3 11728 1726882215.13376: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882215.13380: Calling groups_plugins_play to load vars for managed_node3 11728 1726882215.15700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882215.18466: done with get_vars() 11728 1726882215.18485: variable 'ansible_search_path' from source: unknown 11728 1726882215.18486: variable 'ansible_search_path' from source: unknown 11728 1726882215.18528: variable 'ansible_search_path' from source: unknown 11728 1726882215.18529: variable 'ansible_search_path' from source: unknown 11728 1726882215.18557: we have included files to process 11728 1726882215.18558: generating all_blocks data 11728 
1726882215.18560: done generating all_blocks data 11728 1726882215.18564: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 11728 1726882215.18565: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 11728 1726882215.18567: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 11728 1726882215.20474: done processing included file 11728 1726882215.20476: iterating over new_blocks loaded from include file 11728 1726882215.20478: in VariableManager get_vars() 11728 1726882215.20502: done with get_vars() 11728 1726882215.20504: filtering new block on tags 11728 1726882215.20559: done filtering new block on tags 11728 1726882215.20562: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml for managed_node3 => (item=tasks/create_test_interfaces_with_dhcp.yml) 11728 1726882215.20567: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 11728 1726882215.20568: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 11728 1726882215.20572: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml 11728 1726882215.20664: in VariableManager get_vars() 11728 1726882215.20687: done with get_vars() 11728 1726882215.20696: variable 'item' from source: include params 11728 1726882215.20831: variable 'item' from source: include params 11728 1726882215.20863: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 11728 1726882215.20946: in VariableManager get_vars() 11728 1726882215.20969: done with get_vars() 11728 1726882215.21121: in VariableManager get_vars() 11728 1726882215.21153: done with get_vars() 11728 1726882215.21160: variable 'item' from source: include params 11728 1726882215.21242: variable 'item' from source: include params 11728 1726882215.21272: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 11728 1726882215.21442: in VariableManager get_vars() 11728 1726882215.21483: done with get_vars() 11728 1726882215.21588: done processing included file 11728 1726882215.21590: iterating over new_blocks loaded from include file 11728 1726882215.21591: in VariableManager get_vars() 11728 1726882215.21610: done with get_vars() 11728 1726882215.21612: filtering new block on tags 11728 1726882215.21695: done filtering new block on tags 11728 1726882215.21699: done iterating over new_blocks loaded from include file included: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_dhcp_device_present.yml for managed_node3 => (item=tasks/assert_dhcp_device_present.yml) 11728 1726882215.21703: extending task lists for all hosts with included blocks 11728 1726882215.22366: done extending task lists 11728 1726882215.22368: done processing included files 11728 1726882215.22369: results queue empty 11728 1726882215.22369: checking for any_errors_fatal 11728 1726882215.22372: done checking for any_errors_fatal 11728 1726882215.22373: checking for max_fail_percentage 11728 1726882215.22374: done checking for max_fail_percentage 11728 1726882215.22375: checking to see if all hosts have failed and the running result is not ok 11728 1726882215.22376: done checking to see if all hosts have failed 11728 1726882215.22381: getting the remaining hosts for this loop 11728 1726882215.22383: done getting the remaining hosts for this loop 11728 1726882215.22385: getting the next task for host managed_node3 11728 1726882215.22389: done getting next task for host managed_node3 11728 1726882215.22391: ^ task is: TASK: Install dnsmasq 11728 1726882215.22410: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882215.22413: getting variables 11728 1726882215.22414: in VariableManager get_vars() 11728 1726882215.22426: Calling all_inventory to load vars for managed_node3 11728 1726882215.22429: Calling groups_inventory to load vars for managed_node3 11728 1726882215.22431: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882215.22436: Calling all_plugins_play to load vars for managed_node3 11728 1726882215.22439: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882215.22442: Calling groups_plugins_play to load vars for managed_node3 11728 1726882215.23754: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882215.26062: done with get_vars() 11728 1726882215.26095: done getting variables 11728 1726882215.26137: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 21:30:15 -0400 (0:00:00.167) 0:00:40.114 ****** 11728 1726882215.26172: entering _queue_task() for managed_node3/package 11728 1726882215.26729: worker is 1 (out of 1 available) 11728 1726882215.26739: exiting _queue_task() for managed_node3/package 11728 1726882215.26750: done queuing things up, now waiting for results queue to drain 11728 1726882215.26752: waiting for pending results... 
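[Editor's note] The "Install dnsmasq" task just announced (create_test_interfaces_with_dhcp.yml:3) uses the generic package action, which the trace that follows resolves to ansible.legacy.dnf with name: dnsmasq, state: present, and which returns "Nothing to do" because the package is already present. A hedged sketch consistent with that trace; the retry settings are an assumption (only "attempts": 1 and the evaluated conditional "__install_status is success" are visible):

# Hedged sketch of the "Install dnsmasq" task; module arguments are taken from
# the module invocation echoed later in this log.
- name: Install dnsmasq
  package:
    name: dnsmasq
    state: present
  register: __install_status          # registered name taken from the variable trace below
  until: __install_status is success  # matches the conditional evaluated after the module run
  retries: 3                          # assumption: the retry count is not visible in the log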
11728 1726882215.26884: running TaskExecutor() for managed_node3/TASK: Install dnsmasq 11728 1726882215.27091: in run() - task 12673a56-9f93-5c28-a762-000000000974 11728 1726882215.27098: variable 'ansible_search_path' from source: unknown 11728 1726882215.27100: variable 'ansible_search_path' from source: unknown 11728 1726882215.27103: calling self._execute() 11728 1726882215.27170: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882215.27185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882215.27400: variable 'omit' from source: magic vars 11728 1726882215.27849: variable 'ansible_distribution_major_version' from source: facts 11728 1726882215.27854: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882215.27857: variable 'omit' from source: magic vars 11728 1726882215.27859: variable 'omit' from source: magic vars 11728 1726882215.28400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882215.31073: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882215.31256: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882215.31265: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882215.31311: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882215.31344: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882215.31451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882215.31489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882215.31581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882215.31584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882215.31587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882215.31703: variable '__network_is_ostree' from source: set_fact 11728 1726882215.31715: variable 'omit' from source: magic vars 11728 1726882215.31753: variable 'omit' from source: magic vars 11728 1726882215.31783: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882215.31820: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882215.31846: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882215.31868: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 11728 1726882215.31884: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882215.31948: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882215.31952: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882215.31954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882215.32043: Set connection var ansible_connection to ssh 11728 1726882215.32065: Set connection var ansible_shell_executable to /bin/sh 11728 1726882215.32076: Set connection var ansible_timeout to 10 11728 1726882215.32124: Set connection var ansible_shell_type to sh 11728 1726882215.32128: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882215.32130: Set connection var ansible_pipelining to False 11728 1726882215.32277: variable 'ansible_shell_executable' from source: unknown 11728 1726882215.32280: variable 'ansible_connection' from source: unknown 11728 1726882215.32282: variable 'ansible_module_compression' from source: unknown 11728 1726882215.32284: variable 'ansible_shell_type' from source: unknown 11728 1726882215.32286: variable 'ansible_shell_executable' from source: unknown 11728 1726882215.32288: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882215.32290: variable 'ansible_pipelining' from source: unknown 11728 1726882215.32292: variable 'ansible_timeout' from source: unknown 11728 1726882215.32296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882215.32805: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882215.32808: variable 'omit' from source: magic vars 11728 1726882215.32811: starting attempt loop 11728 1726882215.32813: running the handler 11728 1726882215.32815: variable 'ansible_facts' from source: unknown 11728 1726882215.32817: variable 'ansible_facts' from source: unknown 11728 1726882215.32818: _low_level_execute_command(): starting 11728 1726882215.32821: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882215.34114: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882215.34218: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 
setting O_NONBLOCK <<< 11728 1726882215.34240: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882215.34324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882215.36146: stdout chunk (state=3): >>>/root <<< 11728 1726882215.36207: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882215.36215: stdout chunk (state=3): >>><<< 11728 1726882215.36228: stderr chunk (state=3): >>><<< 11728 1726882215.36254: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882215.36290: _low_level_execute_command(): starting 11728 1726882215.36299: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882215.36253-13747-113158350824714 `" && echo ansible-tmp-1726882215.36253-13747-113158350824714="` echo /root/.ansible/tmp/ansible-tmp-1726882215.36253-13747-113158350824714 `" ) && sleep 0' 11728 1726882215.37499: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882215.37695: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882215.37725: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882215.37824: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882215.37828: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
11728 1726882215.37831: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882215.39706: stdout chunk (state=3): >>>ansible-tmp-1726882215.36253-13747-113158350824714=/root/.ansible/tmp/ansible-tmp-1726882215.36253-13747-113158350824714 <<< 11728 1726882215.39979: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882215.39983: stdout chunk (state=3): >>><<< 11728 1726882215.39991: stderr chunk (state=3): >>><<< 11728 1726882215.40162: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882215.36253-13747-113158350824714=/root/.ansible/tmp/ansible-tmp-1726882215.36253-13747-113158350824714 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882215.40165: variable 'ansible_module_compression' from source: unknown 11728 1726882215.40167: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 11728 1726882215.40169: variable 'ansible_facts' from source: unknown 11728 1726882215.40450: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882215.36253-13747-113158350824714/AnsiballZ_dnf.py 11728 1726882215.40816: Sending initial data 11728 1726882215.40819: Sent initial data (150 bytes) 11728 1726882215.41387: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882215.41390: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882215.41392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882215.41503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 
1726882215.41507: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882215.41510: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882215.41512: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882215.41520: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882215.43221: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11728 1726882215.43229: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 11728 1726882215.43236: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 11728 1726882215.43256: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882215.43326: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882215.43397: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpjgv0cwoe /root/.ansible/tmp/ansible-tmp-1726882215.36253-13747-113158350824714/AnsiballZ_dnf.py <<< 11728 1726882215.43401: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882215.36253-13747-113158350824714/AnsiballZ_dnf.py" <<< 11728 1726882215.43449: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpjgv0cwoe" to remote "/root/.ansible/tmp/ansible-tmp-1726882215.36253-13747-113158350824714/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882215.36253-13747-113158350824714/AnsiballZ_dnf.py" <<< 11728 1726882215.44466: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882215.44469: stdout chunk (state=3): >>><<< 11728 1726882215.44476: stderr chunk (state=3): >>><<< 11728 1726882215.44575: done transferring module to remote 11728 1726882215.44579: _low_level_execute_command(): starting 11728 1726882215.44582: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882215.36253-13747-113158350824714/ /root/.ansible/tmp/ansible-tmp-1726882215.36253-13747-113158350824714/AnsiballZ_dnf.py && sleep 0' 11728 1726882215.45199: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882215.45213: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882215.45231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882215.45250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882215.45352: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882215.45373: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882215.45410: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882215.45471: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882215.47261: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882215.47274: stdout chunk (state=3): >>><<< 11728 1726882215.47287: stderr chunk (state=3): >>><<< 11728 1726882215.47312: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882215.47321: _low_level_execute_command(): starting 11728 1726882215.47332: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882215.36253-13747-113158350824714/AnsiballZ_dnf.py && sleep 0' 11728 1726882215.47927: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882215.47941: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882215.47956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882215.47971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882215.47984: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882215.47998: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882215.48011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882215.48101: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882215.48125: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882215.48203: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882215.88400: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 11728 1726882215.92588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882215.92606: stdout chunk (state=3): >>><<< 11728 1726882215.92620: stderr chunk (state=3): >>><<< 11728 1726882215.92665: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 11728 1726882215.92723: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882215.36253-13747-113158350824714/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882215.92738: _low_level_execute_command(): starting 11728 1726882215.92747: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882215.36253-13747-113158350824714/ > /dev/null 2>&1 && sleep 0' 11728 1726882215.93415: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11728 1726882215.93490: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882215.93508: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882215.93529: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882215.93551: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882215.93566: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882215.93644: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882215.95524: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882215.95542: stdout chunk (state=3): >>><<< 11728 1726882215.95554: stderr chunk (state=3): >>><<< 11728 1726882215.95701: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882215.95705: handler run complete 11728 1726882215.95767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882215.95981: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882215.96031: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882215.96083: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882215.96127: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882215.96220: variable '__install_status' from source: set_fact 11728 1726882215.96262: Evaluated conditional (__install_status is success): True 11728 1726882215.96275: attempt loop complete, returning result 11728 1726882215.96374: _execute() done 11728 1726882215.96377: dumping result to json 11728 1726882215.96380: done dumping result, returning 11728 1726882215.96382: done running TaskExecutor() for managed_node3/TASK: Install dnsmasq [12673a56-9f93-5c28-a762-000000000974] 11728 1726882215.96384: sending task result for task 12673a56-9f93-5c28-a762-000000000974 11728 1726882215.96465: done sending task result for task 12673a56-9f93-5c28-a762-000000000974 11728 1726882215.96468: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 11728 1726882215.96568: no more pending results, returning what we have 11728 1726882215.96574: results queue empty 11728 1726882215.96575: checking for any_errors_fatal 11728 1726882215.96577: done checking for any_errors_fatal 11728 1726882215.96577: checking for max_fail_percentage 11728 1726882215.96579: done checking for max_fail_percentage 11728 1726882215.96580: checking to see if all hosts have failed and the running result is not ok 11728 1726882215.96698: done checking to see if all hosts have failed 11728 1726882215.96699: getting the remaining hosts for this loop 11728 1726882215.96701: done getting the remaining hosts for this loop 11728 1726882215.96705: getting the next task for host managed_node3 11728 1726882215.96713: done getting next task for host managed_node3 11728 1726882215.96716: ^ task is: TASK: Install pgrep, sysctl 11728 1726882215.96720: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882215.96724: getting variables 11728 1726882215.96726: in VariableManager get_vars() 11728 1726882215.96766: Calling all_inventory to load vars for managed_node3 11728 1726882215.96770: Calling groups_inventory to load vars for managed_node3 11728 1726882215.96773: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882215.96784: Calling all_plugins_play to load vars for managed_node3 11728 1726882215.96787: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882215.96789: Calling groups_plugins_play to load vars for managed_node3 11728 1726882215.98520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882216.00240: done with get_vars() 11728 1726882216.00265: done getting variables 11728 1726882216.00327: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Friday 20 September 2024 21:30:16 -0400 (0:00:00.741) 0:00:40.856 ****** 11728 1726882216.00363: entering _queue_task() for managed_node3/package 11728 1726882216.00735: worker is 1 (out of 1 available) 11728 1726882216.00749: exiting _queue_task() for managed_node3/package 11728 1726882216.00763: done queuing things up, now waiting for results queue to drain 11728 1726882216.00765: waiting for pending results... 
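The task queued here sits at line 17 of create_test_interfaces_with_dhcp.yml and, judging from the conditionals evaluated in the records that follow, is the EL6-only variant of the pgrep/sysctl install: it is skipped on this host because ansible_distribution_major_version is version('6', '<=') evaluates to False. A rough reconstruction is sketched below; the actual task file is not part of this log, so the package name (procps is assumed for EL6) and the exact when-clause layout are assumptions:

    # Sketch only - inferred from the evaluated conditionals and the 'package'
    # action plugin recorded in this log, not copied from the task file.
    - name: Install pgrep, sysctl
      package:
        name: procps          # assumed EL6 package providing pgrep and sysctl
        state: present
      when: ansible_distribution_major_version is version('6', '<=')
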
11728 1726882216.01218: running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl 11728 1726882216.01279: in run() - task 12673a56-9f93-5c28-a762-000000000975 11728 1726882216.01308: variable 'ansible_search_path' from source: unknown 11728 1726882216.01317: variable 'ansible_search_path' from source: unknown 11728 1726882216.01368: calling self._execute() 11728 1726882216.01558: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882216.01564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882216.01568: variable 'omit' from source: magic vars 11728 1726882216.02028: variable 'ansible_distribution_major_version' from source: facts 11728 1726882216.02103: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882216.02178: variable 'ansible_os_family' from source: facts 11728 1726882216.02190: Evaluated conditional (ansible_os_family == 'RedHat'): True 11728 1726882216.02604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882216.03099: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882216.03231: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882216.03369: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882216.03411: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882216.03592: variable 'ansible_distribution_major_version' from source: facts 11728 1726882216.03651: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 11728 1726882216.03664: when evaluation is False, skipping this task 11728 1726882216.03672: _execute() done 11728 1726882216.03679: dumping result to json 11728 1726882216.03686: done dumping result, returning 11728 1726882216.03700: done running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl [12673a56-9f93-5c28-a762-000000000975] 11728 1726882216.03710: sending task result for task 12673a56-9f93-5c28-a762-000000000975 11728 1726882216.04028: done sending task result for task 12673a56-9f93-5c28-a762-000000000975 11728 1726882216.04031: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 11728 1726882216.04078: no more pending results, returning what we have 11728 1726882216.04082: results queue empty 11728 1726882216.04083: checking for any_errors_fatal 11728 1726882216.04099: done checking for any_errors_fatal 11728 1726882216.04100: checking for max_fail_percentage 11728 1726882216.04102: done checking for max_fail_percentage 11728 1726882216.04103: checking to see if all hosts have failed and the running result is not ok 11728 1726882216.04104: done checking to see if all hosts have failed 11728 1726882216.04104: getting the remaining hosts for this loop 11728 1726882216.04107: done getting the remaining hosts for this loop 11728 1726882216.04111: getting the next task for host managed_node3 11728 1726882216.04118: done getting next task for host managed_node3 11728 1726882216.04121: ^ task is: TASK: Install pgrep, sysctl 11728 1726882216.04125: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882216.04130: getting variables 11728 1726882216.04132: in VariableManager get_vars() 11728 1726882216.04175: Calling all_inventory to load vars for managed_node3 11728 1726882216.04178: Calling groups_inventory to load vars for managed_node3 11728 1726882216.04180: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882216.04192: Calling all_plugins_play to load vars for managed_node3 11728 1726882216.04217: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882216.04222: Calling groups_plugins_play to load vars for managed_node3 11728 1726882216.05672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882216.07230: done with get_vars() 11728 1726882216.07254: done getting variables 11728 1726882216.07724: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Friday 20 September 2024 21:30:16 -0400 (0:00:00.073) 0:00:40.929 ****** 11728 1726882216.07757: entering _queue_task() for managed_node3/package 11728 1726882216.08178: worker is 1 (out of 1 available) 11728 1726882216.08192: exiting _queue_task() for managed_node3/package 11728 1726882216.08509: done queuing things up, now waiting for results queue to drain 11728 1726882216.08511: waiting for pending results... 
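This second "Install pgrep, sysctl" task, at line 26 of the same file, is the one that actually runs on this host: the records below show ansible_distribution_major_version is version('7', '>=') evaluating to True and the package action resolving to ansible.legacy.dnf with name=['procps-ng'], state=present. A task that would produce exactly that invocation is sketched here; the real task file is not included in the log, so treat the YAML layout as a reconstruction rather than the file's contents:

    # Sketch - module arguments match what the log records for task
    # 12673a56-9f93-5c28-a762-000000000976; the YAML formatting is assumed.
    - name: Install pgrep, sysctl
      package:
        name: procps-ng
        state: present
      when: ansible_distribution_major_version is version('7', '>=')
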
11728 1726882216.08699: running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl 11728 1726882216.09274: in run() - task 12673a56-9f93-5c28-a762-000000000976 11728 1726882216.09279: variable 'ansible_search_path' from source: unknown 11728 1726882216.09283: variable 'ansible_search_path' from source: unknown 11728 1726882216.09287: calling self._execute() 11728 1726882216.09492: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882216.09498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882216.09503: variable 'omit' from source: magic vars 11728 1726882216.10214: variable 'ansible_distribution_major_version' from source: facts 11728 1726882216.10230: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882216.10429: variable 'ansible_os_family' from source: facts 11728 1726882216.10499: Evaluated conditional (ansible_os_family == 'RedHat'): True 11728 1726882216.10792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882216.11589: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882216.11592: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882216.11695: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882216.11736: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882216.11998: variable 'ansible_distribution_major_version' from source: facts 11728 1726882216.12002: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 11728 1726882216.12010: variable 'omit' from source: magic vars 11728 1726882216.12058: variable 'omit' from source: magic vars 11728 1726882216.12402: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882216.16051: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882216.16132: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882216.16167: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882216.16201: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882216.16241: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882216.16342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882216.16373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882216.16402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882216.16460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882216.16482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882216.16583: variable '__network_is_ostree' from source: set_fact 11728 1726882216.16592: variable 'omit' from source: magic vars 11728 1726882216.16630: variable 'omit' from source: magic vars 11728 1726882216.16666: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882216.16697: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882216.16719: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882216.16738: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882216.16754: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882216.16789: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882216.16799: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882216.16807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882216.16964: Set connection var ansible_connection to ssh 11728 1726882216.16967: Set connection var ansible_shell_executable to /bin/sh 11728 1726882216.16970: Set connection var ansible_timeout to 10 11728 1726882216.16972: Set connection var ansible_shell_type to sh 11728 1726882216.16974: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882216.16976: Set connection var ansible_pipelining to False 11728 1726882216.17001: variable 'ansible_shell_executable' from source: unknown 11728 1726882216.17008: variable 'ansible_connection' from source: unknown 11728 1726882216.17014: variable 'ansible_module_compression' from source: unknown 11728 1726882216.17019: variable 'ansible_shell_type' from source: unknown 11728 1726882216.17024: variable 'ansible_shell_executable' from source: unknown 11728 1726882216.17030: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882216.17036: variable 'ansible_pipelining' from source: unknown 11728 1726882216.17042: variable 'ansible_timeout' from source: unknown 11728 1726882216.17048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882216.17182: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882216.17185: variable 'omit' from source: magic vars 11728 1726882216.17188: starting attempt loop 11728 1726882216.17197: running the handler 11728 1726882216.17215: variable 'ansible_facts' from source: unknown 11728 1726882216.17292: variable 'ansible_facts' from source: unknown 11728 1726882216.17297: _low_level_execute_command(): starting 11728 1726882216.17300: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882216.18286: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882216.18404: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882216.18418: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882216.18577: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882216.20354: stdout chunk (state=3): >>>/root <<< 11728 1726882216.20450: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882216.20454: stdout chunk (state=3): >>><<< 11728 1726882216.20456: stderr chunk (state=3): >>><<< 11728 1726882216.20459: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882216.20475: _low_level_execute_command(): starting 11728 1726882216.20486: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882216.2046282-13798-263646094828382 `" && echo ansible-tmp-1726882216.2046282-13798-263646094828382="` echo /root/.ansible/tmp/ansible-tmp-1726882216.2046282-13798-263646094828382 `" ) && sleep 0' 11728 1726882216.21811: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882216.21920: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882216.21986: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882216.22048: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882216.24210: stdout chunk (state=3): >>>ansible-tmp-1726882216.2046282-13798-263646094828382=/root/.ansible/tmp/ansible-tmp-1726882216.2046282-13798-263646094828382 <<< 11728 1726882216.24214: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882216.24216: stdout chunk (state=3): >>><<< 11728 1726882216.24219: stderr chunk (state=3): >>><<< 11728 1726882216.24221: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882216.2046282-13798-263646094828382=/root/.ansible/tmp/ansible-tmp-1726882216.2046282-13798-263646094828382 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882216.24223: variable 'ansible_module_compression' from source: unknown 11728 1726882216.24255: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 11728 1726882216.24428: variable 'ansible_facts' from source: unknown 11728 1726882216.24811: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882216.2046282-13798-263646094828382/AnsiballZ_dnf.py 11728 1726882216.25351: Sending initial data 11728 1726882216.25354: Sent initial data (152 bytes) 11728 1726882216.26473: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882216.26709: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882216.26715: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11728 1726882216.26798: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882216.26819: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882216.26895: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882216.28436: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11728 1726882216.28446: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882216.28503: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882216.28551: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpx6wl_af7 /root/.ansible/tmp/ansible-tmp-1726882216.2046282-13798-263646094828382/AnsiballZ_dnf.py <<< 11728 1726882216.28578: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882216.2046282-13798-263646094828382/AnsiballZ_dnf.py" <<< 11728 1726882216.28600: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpx6wl_af7" to remote "/root/.ansible/tmp/ansible-tmp-1726882216.2046282-13798-263646094828382/AnsiballZ_dnf.py" <<< 11728 1726882216.28666: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882216.2046282-13798-263646094828382/AnsiballZ_dnf.py" <<< 11728 1726882216.30209: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882216.30228: stderr chunk (state=3): >>><<< 11728 1726882216.30231: stdout chunk (state=3): >>><<< 11728 1726882216.30347: done transferring module to remote 11728 1726882216.30409: _low_level_execute_command(): starting 11728 1726882216.30417: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882216.2046282-13798-263646094828382/ /root/.ansible/tmp/ansible-tmp-1726882216.2046282-13798-263646094828382/AnsiballZ_dnf.py && sleep 0' 11728 1726882216.31686: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882216.31690: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882216.31692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882216.31710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882216.31714: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882216.31911: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882216.31941: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882216.32013: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882216.33961: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882216.33964: stdout chunk (state=3): >>><<< 11728 1726882216.33967: stderr chunk (state=3): >>><<< 11728 1726882216.33969: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882216.33972: _low_level_execute_command(): starting 11728 1726882216.33974: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882216.2046282-13798-263646094828382/AnsiballZ_dnf.py && sleep 0' 11728 1726882216.35074: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882216.35109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882216.35169: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882216.35280: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882216.35299: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882216.35356: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882216.76272: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 11728 1726882216.80249: stderr chunk (state=3): >>>debug2: Received 
exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882216.80254: stdout chunk (state=3): >>><<< 11728 1726882216.80261: stderr chunk (state=3): >>><<< 11728 1726882216.80297: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
11728 1726882216.80401: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882216.2046282-13798-263646094828382/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882216.80405: _low_level_execute_command(): starting 11728 1726882216.80407: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882216.2046282-13798-263646094828382/ > /dev/null 2>&1 && sleep 0' 11728 1726882216.81026: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882216.81074: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882216.81147: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882216.81184: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882216.81259: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882216.83301: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882216.83304: stdout chunk (state=3): >>><<< 11728 1726882216.83307: stderr chunk (state=3): >>><<< 11728 1726882216.83309: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882216.83312: handler run complete 11728 1726882216.83314: attempt loop complete, returning result 11728 1726882216.83317: _execute() done 11728 1726882216.83319: dumping result to json 11728 1726882216.83321: done dumping result, returning 11728 1726882216.83323: done running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl [12673a56-9f93-5c28-a762-000000000976] 11728 1726882216.83325: sending task result for task 12673a56-9f93-5c28-a762-000000000976 11728 1726882216.83399: done sending task result for task 12673a56-9f93-5c28-a762-000000000976 11728 1726882216.83402: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 11728 1726882216.83666: no more pending results, returning what we have 11728 1726882216.83670: results queue empty 11728 1726882216.83671: checking for any_errors_fatal 11728 1726882216.83676: done checking for any_errors_fatal 11728 1726882216.83677: checking for max_fail_percentage 11728 1726882216.83678: done checking for max_fail_percentage 11728 1726882216.83679: checking to see if all hosts have failed and the running result is not ok 11728 1726882216.83680: done checking to see if all hosts have failed 11728 1726882216.83680: getting the remaining hosts for this loop 11728 1726882216.83682: done getting the remaining hosts for this loop 11728 1726882216.83685: getting the next task for host managed_node3 11728 1726882216.83691: done getting next task for host managed_node3 11728 1726882216.83700: ^ task is: TASK: Create test interfaces 11728 1726882216.83704: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882216.83709: getting variables 11728 1726882216.83710: in VariableManager get_vars() 11728 1726882216.83744: Calling all_inventory to load vars for managed_node3 11728 1726882216.83747: Calling groups_inventory to load vars for managed_node3 11728 1726882216.83749: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882216.83758: Calling all_plugins_play to load vars for managed_node3 11728 1726882216.83760: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882216.83762: Calling groups_plugins_play to load vars for managed_node3 11728 1726882216.85200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882216.86698: done with get_vars() 11728 1726882216.86719: done getting variables 11728 1726882216.86771: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Create test interfaces] ************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Friday 20 September 2024 21:30:16 -0400 (0:00:00.790) 0:00:41.720 ****** 11728 1726882216.86803: entering _queue_task() for managed_node3/shell 11728 1726882216.87094: worker is 1 (out of 1 available) 11728 1726882216.87106: exiting _queue_task() for managed_node3/shell 11728 1726882216.87117: done queuing things up, now waiting for results queue to drain 11728 1726882216.87118: waiting for pending results... 
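The "Create test interfaces" task at line 35 is a shell task: the records below show the shell action plugin being loaded, the play vars dhcp_interface1 and dhcp_interface2 being resolved, and the full script (veth pairs test1/test2, a testbr bridge, and a dnsmasq DHCP server) appearing verbatim in the "cmd" field of the result. A minimal sketch of such a task follows; templating the interface names from the play vars is an assumption, and only the opening lines of the script are repeated since the complete text is captured later in this log:

    # Sketch - opening lines taken from the "cmd" field recorded below;
    # use of {{ dhcp_interface1 }} / {{ dhcp_interface2 }} is assumed from the
    # play vars this task references.
    - name: Create test interfaces
      shell: |
        set -euxo pipefail
        exec 1>&2
        ip link add {{ dhcp_interface1 }} type veth peer name {{ dhcp_interface1 }}p
        ip link add {{ dhcp_interface2 }} type veth peer name {{ dhcp_interface2 }}p
        # ... remainder as captured in the "cmd" field of the task result below
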
11728 1726882216.87514: running TaskExecutor() for managed_node3/TASK: Create test interfaces 11728 1726882216.87519: in run() - task 12673a56-9f93-5c28-a762-000000000977 11728 1726882216.87526: variable 'ansible_search_path' from source: unknown 11728 1726882216.87534: variable 'ansible_search_path' from source: unknown 11728 1726882216.87572: calling self._execute() 11728 1726882216.87671: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882216.87683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882216.87700: variable 'omit' from source: magic vars 11728 1726882216.88083: variable 'ansible_distribution_major_version' from source: facts 11728 1726882216.88104: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882216.88115: variable 'omit' from source: magic vars 11728 1726882216.88166: variable 'omit' from source: magic vars 11728 1726882216.88555: variable 'dhcp_interface1' from source: play vars 11728 1726882216.88567: variable 'dhcp_interface2' from source: play vars 11728 1726882216.88600: variable 'omit' from source: magic vars 11728 1726882216.88643: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882216.88682: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882216.88713: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882216.88737: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882216.88755: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882216.88788: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882216.88800: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882216.88815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882216.89027: Set connection var ansible_connection to ssh 11728 1726882216.89029: Set connection var ansible_shell_executable to /bin/sh 11728 1726882216.89031: Set connection var ansible_timeout to 10 11728 1726882216.89033: Set connection var ansible_shell_type to sh 11728 1726882216.89035: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882216.89036: Set connection var ansible_pipelining to False 11728 1726882216.89039: variable 'ansible_shell_executable' from source: unknown 11728 1726882216.89040: variable 'ansible_connection' from source: unknown 11728 1726882216.89042: variable 'ansible_module_compression' from source: unknown 11728 1726882216.89043: variable 'ansible_shell_type' from source: unknown 11728 1726882216.89045: variable 'ansible_shell_executable' from source: unknown 11728 1726882216.89046: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882216.89048: variable 'ansible_pipelining' from source: unknown 11728 1726882216.89049: variable 'ansible_timeout' from source: unknown 11728 1726882216.89051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882216.89146: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882216.89162: variable 'omit' from source: magic vars 11728 1726882216.89169: starting attempt loop 11728 1726882216.89175: running the handler 11728 1726882216.89188: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882216.89214: _low_level_execute_command(): starting 11728 1726882216.89229: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882216.89934: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882216.89946: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882216.90012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882216.90072: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882216.90091: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882216.90124: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882216.90204: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882216.91861: stdout chunk (state=3): >>>/root <<< 11728 1726882216.92013: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882216.92016: stdout chunk (state=3): >>><<< 11728 1726882216.92019: stderr chunk (state=3): >>><<< 11728 1726882216.92132: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882216.92136: _low_level_execute_command(): starting 11728 1726882216.92138: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882216.9204443-13842-75420557301511 `" && echo ansible-tmp-1726882216.9204443-13842-75420557301511="` echo /root/.ansible/tmp/ansible-tmp-1726882216.9204443-13842-75420557301511 `" ) && sleep 0' 11728 1726882216.92709: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882216.92747: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882216.92752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882216.92809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882216.92875: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882216.92902: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882216.92943: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882216.92998: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882216.94865: stdout chunk (state=3): >>>ansible-tmp-1726882216.9204443-13842-75420557301511=/root/.ansible/tmp/ansible-tmp-1726882216.9204443-13842-75420557301511 <<< 11728 1726882216.95035: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882216.95038: stdout chunk (state=3): >>><<< 11728 1726882216.95041: stderr chunk (state=3): >>><<< 11728 1726882216.95058: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882216.9204443-13842-75420557301511=/root/.ansible/tmp/ansible-tmp-1726882216.9204443-13842-75420557301511 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882216.95096: variable 'ansible_module_compression' from source: unknown 11728 1726882216.95198: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882216.95203: variable 'ansible_facts' from source: unknown 11728 1726882216.95300: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882216.9204443-13842-75420557301511/AnsiballZ_command.py 11728 1726882216.95539: Sending initial data 11728 1726882216.95543: Sent initial data (155 bytes) 11728 1726882216.96130: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882216.96143: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882216.96177: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882216.96190: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882216.96213: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882216.96292: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882216.97878: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882216.97943: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882216.98030: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpgrfzzon7 /root/.ansible/tmp/ansible-tmp-1726882216.9204443-13842-75420557301511/AnsiballZ_command.py <<< 11728 1726882216.98033: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882216.9204443-13842-75420557301511/AnsiballZ_command.py" <<< 11728 1726882216.98081: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpgrfzzon7" to remote "/root/.ansible/tmp/ansible-tmp-1726882216.9204443-13842-75420557301511/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882216.9204443-13842-75420557301511/AnsiballZ_command.py" <<< 11728 1726882216.98908: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882216.99068: stderr chunk (state=3): >>><<< 11728 1726882216.99071: stdout chunk (state=3): >>><<< 11728 1726882216.99074: done transferring module to remote 11728 1726882216.99076: _low_level_execute_command(): starting 11728 1726882216.99078: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882216.9204443-13842-75420557301511/ /root/.ansible/tmp/ansible-tmp-1726882216.9204443-13842-75420557301511/AnsiballZ_command.py && sleep 0' 11728 1726882216.99654: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882216.99710: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882216.99790: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882216.99818: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882216.99896: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882217.01676: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882217.01680: stdout chunk (state=3): >>><<< 11728 1726882217.01789: stderr chunk (state=3): >>><<< 11728 1726882217.01795: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882217.01799: _low_level_execute_command(): starting 11728 1726882217.01802: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882216.9204443-13842-75420557301511/AnsiballZ_command.py && sleep 0' 11728 1726882217.02390: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882217.02411: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882217.02429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882217.02538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882217.02568: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882217.02658: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882218.39833: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 711 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 711 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid 
--dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 21:30:17.176349", "end": "2024-09-20 21:30:18.395949", "delta": "0:00:01.219600", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882218.41556: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882218.41560: stdout chunk (state=3): >>><<< 11728 1726882218.41562: stderr chunk (state=3): >>><<< 11728 1726882218.41565: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 711 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 711 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! 
rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 21:30:17.176349", "end": "2024-09-20 21:30:18.395949", "delta": "0:00:01.219600", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
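[note] The 'Create test interfaces' command that just completed leaves a dnsmasq instance (pid file /run/dhcp_testbr.pid), the testbr bridge, and the test1/test1p and test2/test2p veth pairs on the managed host. A minimal teardown sketch for that environment, assuming the same interface names and pid file as in the script above; this is not part of the recorded playbook run:

    #!/bin/sh
    # Tear down the artifacts created by the 'Create test interfaces' task.
    set -x

    # Stop the per-test dnsmasq instance, if still running, and remove its state files.
    if [ -f /run/dhcp_testbr.pid ]; then
        kill "$(cat /run/dhcp_testbr.pid)" 2>/dev/null || true
        rm -f /run/dhcp_testbr.pid /run/dhcp_testbr.lease
    fi

    # Deleting one end of a veth pair removes its peer as well.
    ip link del test1 2>/dev/null || true
    ip link del test2 2>/dev/null || true
    ip link del testbr 2>/dev/null || true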
11728 1726882218.41591: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882216.9204443-13842-75420557301511/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882218.41609: _low_level_execute_command(): starting 11728 1726882218.41621: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882216.9204443-13842-75420557301511/ > /dev/null 2>&1 && sleep 0' 11728 1726882218.42510: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882218.42525: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882218.42541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882218.42558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882218.42583: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882218.42770: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882218.42911: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882218.43033: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882218.44913: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882218.44917: stdout chunk (state=3): >>><<< 11728 1726882218.44919: stderr chunk (state=3): >>><<< 11728 1726882218.44944: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882218.44956: handler run complete 11728 1726882218.44983: Evaluated conditional (False): False 11728 1726882218.44999: attempt loop complete, returning result 11728 1726882218.45006: _execute() done 11728 1726882218.45013: dumping result to json 11728 1726882218.45022: done dumping result, returning 11728 1726882218.45039: done running TaskExecutor() for managed_node3/TASK: Create test interfaces [12673a56-9f93-5c28-a762-000000000977] 11728 1726882218.45403: sending task result for task 12673a56-9f93-5c28-a762-000000000977 11728 1726882218.45478: done sending task result for task 12673a56-9f93-5c28-a762-000000000977 11728 1726882218.45481: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.219600", "end": "2024-09-20 21:30:18.395949", "rc": 0, "start": "2024-09-20 21:30:17.176349" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 711 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 711 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + ip addr show testbr + grep -q 'inet [1-9]' + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 11728 1726882218.45773: no more pending results, returning what we have 11728 1726882218.45777: results queue empty 11728 1726882218.45778: checking for any_errors_fatal 11728 1726882218.45785: done checking for any_errors_fatal 11728 1726882218.45786: checking for max_fail_percentage 11728 1726882218.45787: done checking for max_fail_percentage 11728 1726882218.45788: checking to see if all hosts have failed and the running result is not ok 11728 1726882218.45789: done checking to see if all hosts have failed 11728 1726882218.45790: getting the remaining hosts for this loop 11728 1726882218.45791: done getting the remaining hosts for this loop 11728 1726882218.45797: getting the next task for host managed_node3 11728 1726882218.45807: done getting next task for host managed_node3 11728 1726882218.45810: ^ task is: TASK: 
Include the task 'get_interface_stat.yml' 11728 1726882218.45814: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882218.45817: getting variables 11728 1726882218.45818: in VariableManager get_vars() 11728 1726882218.45852: Calling all_inventory to load vars for managed_node3 11728 1726882218.45855: Calling groups_inventory to load vars for managed_node3 11728 1726882218.45858: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882218.45867: Calling all_plugins_play to load vars for managed_node3 11728 1726882218.45870: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882218.45873: Calling groups_plugins_play to load vars for managed_node3 11728 1726882218.47990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882218.57439: done with get_vars() 11728 1726882218.57470: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:30:18 -0400 (0:00:01.708) 0:00:43.428 ****** 11728 1726882218.57631: entering _queue_task() for managed_node3/include_tasks 11728 1726882218.58364: worker is 1 (out of 1 available) 11728 1726882218.58376: exiting _queue_task() for managed_node3/include_tasks 11728 1726882218.58388: done queuing things up, now waiting for results queue to drain 11728 1726882218.58390: waiting for pending results... 
11728 1726882218.58924: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 11728 1726882218.59083: in run() - task 12673a56-9f93-5c28-a762-00000000097e 11728 1726882218.59113: variable 'ansible_search_path' from source: unknown 11728 1726882218.59124: variable 'ansible_search_path' from source: unknown 11728 1726882218.59171: calling self._execute() 11728 1726882218.59278: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882218.59295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882218.59334: variable 'omit' from source: magic vars 11728 1726882218.59771: variable 'ansible_distribution_major_version' from source: facts 11728 1726882218.60011: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882218.60015: _execute() done 11728 1726882218.60018: dumping result to json 11728 1726882218.60021: done dumping result, returning 11728 1726882218.60024: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-5c28-a762-00000000097e] 11728 1726882218.60026: sending task result for task 12673a56-9f93-5c28-a762-00000000097e 11728 1726882218.60108: done sending task result for task 12673a56-9f93-5c28-a762-00000000097e 11728 1726882218.60226: WORKER PROCESS EXITING 11728 1726882218.60256: no more pending results, returning what we have 11728 1726882218.60261: in VariableManager get_vars() 11728 1726882218.60314: Calling all_inventory to load vars for managed_node3 11728 1726882218.60317: Calling groups_inventory to load vars for managed_node3 11728 1726882218.60320: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882218.60343: Calling all_plugins_play to load vars for managed_node3 11728 1726882218.60347: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882218.60399: Calling groups_plugins_play to load vars for managed_node3 11728 1726882218.62332: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882218.63935: done with get_vars() 11728 1726882218.63953: variable 'ansible_search_path' from source: unknown 11728 1726882218.63954: variable 'ansible_search_path' from source: unknown 11728 1726882218.63991: we have included files to process 11728 1726882218.63995: generating all_blocks data 11728 1726882218.63997: done generating all_blocks data 11728 1726882218.64004: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11728 1726882218.64005: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11728 1726882218.64008: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11728 1726882218.64209: done processing included file 11728 1726882218.64211: iterating over new_blocks loaded from include file 11728 1726882218.64213: in VariableManager get_vars() 11728 1726882218.64235: done with get_vars() 11728 1726882218.64237: filtering new block on tags 11728 1726882218.64273: done filtering new block on tags 11728 1726882218.64276: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 11728 
1726882218.64281: extending task lists for all hosts with included blocks 11728 1726882218.64503: done extending task lists 11728 1726882218.64504: done processing included files 11728 1726882218.64505: results queue empty 11728 1726882218.64506: checking for any_errors_fatal 11728 1726882218.64513: done checking for any_errors_fatal 11728 1726882218.64513: checking for max_fail_percentage 11728 1726882218.64514: done checking for max_fail_percentage 11728 1726882218.64515: checking to see if all hosts have failed and the running result is not ok 11728 1726882218.64516: done checking to see if all hosts have failed 11728 1726882218.64517: getting the remaining hosts for this loop 11728 1726882218.64518: done getting the remaining hosts for this loop 11728 1726882218.64520: getting the next task for host managed_node3 11728 1726882218.64525: done getting next task for host managed_node3 11728 1726882218.64526: ^ task is: TASK: Get stat for interface {{ interface }} 11728 1726882218.64531: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882218.64533: getting variables 11728 1726882218.64534: in VariableManager get_vars() 11728 1726882218.64546: Calling all_inventory to load vars for managed_node3 11728 1726882218.64548: Calling groups_inventory to load vars for managed_node3 11728 1726882218.64550: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882218.64556: Calling all_plugins_play to load vars for managed_node3 11728 1726882218.64558: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882218.64561: Calling groups_plugins_play to load vars for managed_node3 11728 1726882218.66666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882218.68526: done with get_vars() 11728 1726882218.68556: done getting variables 11728 1726882218.68759: variable 'interface' from source: task vars 11728 1726882218.68763: variable 'dhcp_interface1' from source: play vars 11728 1726882218.69024: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:30:18 -0400 (0:00:00.114) 0:00:43.543 ****** 11728 1726882218.69059: entering _queue_task() for managed_node3/stat 11728 1726882218.69836: worker is 1 (out of 1 available) 11728 1726882218.69848: exiting _queue_task() for managed_node3/stat 11728 1726882218.69858: done queuing things up, now waiting for results queue to drain 11728 1726882218.69860: waiting for pending results... 11728 1726882218.70148: running TaskExecutor() for managed_node3/TASK: Get stat for interface test1 11728 1726882218.70389: in run() - task 12673a56-9f93-5c28-a762-0000000009dd 11728 1726882218.70469: variable 'ansible_search_path' from source: unknown 11728 1726882218.70473: variable 'ansible_search_path' from source: unknown 11728 1726882218.70478: calling self._execute() 11728 1726882218.70599: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882218.70702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882218.70718: variable 'omit' from source: magic vars 11728 1726882218.71297: variable 'ansible_distribution_major_version' from source: facts 11728 1726882218.71401: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882218.71405: variable 'omit' from source: magic vars 11728 1726882218.71407: variable 'omit' from source: magic vars 11728 1726882218.71589: variable 'interface' from source: task vars 11728 1726882218.71726: variable 'dhcp_interface1' from source: play vars 11728 1726882218.71779: variable 'dhcp_interface1' from source: play vars 11728 1726882218.71809: variable 'omit' from source: magic vars 11728 1726882218.71874: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882218.71981: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882218.72074: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882218.72102: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882218.72120: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 11728 1726882218.72192: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882218.72239: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882218.72249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882218.72369: Set connection var ansible_connection to ssh 11728 1726882218.72409: Set connection var ansible_shell_executable to /bin/sh 11728 1726882218.72438: Set connection var ansible_timeout to 10 11728 1726882218.72488: Set connection var ansible_shell_type to sh 11728 1726882218.72491: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882218.72495: Set connection var ansible_pipelining to False 11728 1726882218.72498: variable 'ansible_shell_executable' from source: unknown 11728 1726882218.72505: variable 'ansible_connection' from source: unknown 11728 1726882218.72514: variable 'ansible_module_compression' from source: unknown 11728 1726882218.72522: variable 'ansible_shell_type' from source: unknown 11728 1726882218.72529: variable 'ansible_shell_executable' from source: unknown 11728 1726882218.72544: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882218.72601: variable 'ansible_pipelining' from source: unknown 11728 1726882218.72605: variable 'ansible_timeout' from source: unknown 11728 1726882218.72608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882218.72781: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11728 1726882218.72802: variable 'omit' from source: magic vars 11728 1726882218.72835: starting attempt loop 11728 1726882218.72844: running the handler 11728 1726882218.72863: _low_level_execute_command(): starting 11728 1726882218.72898: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882218.73706: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882218.73765: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882218.73812: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882218.75534: stdout chunk (state=3): >>>/root <<< 11728 1726882218.75689: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 
1726882218.76124: stderr chunk (state=3): >>><<< 11728 1726882218.76129: stdout chunk (state=3): >>><<< 11728 1726882218.76137: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882218.76142: _low_level_execute_command(): starting 11728 1726882218.76145: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882218.7602775-13937-28254446301024 `" && echo ansible-tmp-1726882218.7602775-13937-28254446301024="` echo /root/.ansible/tmp/ansible-tmp-1726882218.7602775-13937-28254446301024 `" ) && sleep 0' 11728 1726882218.77273: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882218.77303: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882218.77358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882218.77491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882218.77523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882218.77622: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882218.77659: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882218.77715: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882218.77766: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882218.79884: stdout chunk (state=3): 
>>>ansible-tmp-1726882218.7602775-13937-28254446301024=/root/.ansible/tmp/ansible-tmp-1726882218.7602775-13937-28254446301024 <<< 11728 1726882218.79888: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882218.79891: stdout chunk (state=3): >>><<< 11728 1726882218.79895: stderr chunk (state=3): >>><<< 11728 1726882218.79898: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882218.7602775-13937-28254446301024=/root/.ansible/tmp/ansible-tmp-1726882218.7602775-13937-28254446301024 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882218.79995: variable 'ansible_module_compression' from source: unknown 11728 1726882218.80305: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11728 1726882218.80309: variable 'ansible_facts' from source: unknown 11728 1726882218.80371: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882218.7602775-13937-28254446301024/AnsiballZ_stat.py 11728 1726882218.80653: Sending initial data 11728 1726882218.80656: Sent initial data (152 bytes) 11728 1726882218.81421: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882218.81436: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882218.81520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882218.81564: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882218.81588: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
11728 1726882218.81643: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882218.81858: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882218.83302: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882218.83344: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882218.83382: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmprl81vy9g /root/.ansible/tmp/ansible-tmp-1726882218.7602775-13937-28254446301024/AnsiballZ_stat.py <<< 11728 1726882218.83392: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882218.7602775-13937-28254446301024/AnsiballZ_stat.py" <<< 11728 1726882218.83637: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmprl81vy9g" to remote "/root/.ansible/tmp/ansible-tmp-1726882218.7602775-13937-28254446301024/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882218.7602775-13937-28254446301024/AnsiballZ_stat.py" <<< 11728 1726882218.84874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882218.85056: stderr chunk (state=3): >>><<< 11728 1726882218.85059: stdout chunk (state=3): >>><<< 11728 1726882218.85062: done transferring module to remote 11728 1726882218.85064: _low_level_execute_command(): starting 11728 1726882218.85066: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882218.7602775-13937-28254446301024/ /root/.ansible/tmp/ansible-tmp-1726882218.7602775-13937-28254446301024/AnsiballZ_stat.py && sleep 0' 11728 1726882218.86330: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882218.86599: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882218.86812: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882218.86886: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882218.88714: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882218.88760: stderr chunk (state=3): >>><<< 11728 1726882218.88766: stdout chunk (state=3): >>><<< 11728 1726882218.88870: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882218.88874: _low_level_execute_command(): starting 11728 1726882218.88876: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882218.7602775-13937-28254446301024/AnsiballZ_stat.py && sleep 0' 11728 1726882218.89951: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882218.90227: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882218.90278: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882219.05480: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": 
false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28714, "dev": 23, "nlink": 1, "atime": 1726882217.1828077, "mtime": 1726882217.1828077, "ctime": 1726882217.1828077, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11728 1726882219.06828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882219.06833: stdout chunk (state=3): >>><<< 11728 1726882219.06836: stderr chunk (state=3): >>><<< 11728 1726882219.06902: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28714, "dev": 23, "nlink": 1, "atime": 1726882217.1828077, "mtime": 1726882217.1828077, "ctime": 1726882217.1828077, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
11728 1726882219.06916: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882218.7602775-13937-28254446301024/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882219.06955: _low_level_execute_command(): starting 11728 1726882219.06959: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882218.7602775-13937-28254446301024/ > /dev/null 2>&1 && sleep 0' 11728 1726882219.08032: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882219.08497: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882219.08507: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882219.08510: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882219.08512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882219.10233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882219.10282: stderr chunk (state=3): >>><<< 11728 1726882219.10500: stdout chunk (state=3): >>><<< 11728 1726882219.10504: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 
originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882219.10507: handler run complete 11728 1726882219.10510: attempt loop complete, returning result 11728 1726882219.10513: _execute() done 11728 1726882219.10582: dumping result to json 11728 1726882219.10585: done dumping result, returning 11728 1726882219.10588: done running TaskExecutor() for managed_node3/TASK: Get stat for interface test1 [12673a56-9f93-5c28-a762-0000000009dd] 11728 1726882219.10590: sending task result for task 12673a56-9f93-5c28-a762-0000000009dd 11728 1726882219.10968: done sending task result for task 12673a56-9f93-5c28-a762-0000000009dd 11728 1726882219.10971: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726882217.1828077, "block_size": 4096, "blocks": 0, "ctime": 1726882217.1828077, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 28714, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1726882217.1828077, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11728 1726882219.11181: no more pending results, returning what we have 11728 1726882219.11185: results queue empty 11728 1726882219.11186: checking for any_errors_fatal 11728 1726882219.11188: done checking for any_errors_fatal 11728 1726882219.11188: checking for max_fail_percentage 11728 1726882219.11199: done checking for max_fail_percentage 11728 1726882219.11201: checking to see if all hosts have failed and the running result is not ok 11728 1726882219.11202: done checking to see if all hosts have failed 11728 1726882219.11203: getting the remaining hosts for this loop 11728 1726882219.11205: done getting the remaining hosts for this loop 11728 1726882219.11208: getting the next task for host managed_node3 11728 1726882219.11217: done getting next task for host managed_node3 11728 1726882219.11221: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11728 1726882219.11225: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882219.11230: getting variables 11728 1726882219.11232: in VariableManager get_vars() 11728 1726882219.11271: Calling all_inventory to load vars for managed_node3 11728 1726882219.11274: Calling groups_inventory to load vars for managed_node3 11728 1726882219.11276: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882219.11288: Calling all_plugins_play to load vars for managed_node3 11728 1726882219.11292: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882219.11529: Calling groups_plugins_play to load vars for managed_node3 11728 1726882219.14384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882219.17858: done with get_vars() 11728 1726882219.17890: done getting variables 11728 1726882219.18042: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11728 1726882219.18214: variable 'interface' from source: task vars 11728 1726882219.18218: variable 'dhcp_interface1' from source: play vars 11728 1726882219.18392: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:30:19 -0400 (0:00:00.493) 0:00:44.036 ****** 11728 1726882219.18431: entering _queue_task() for managed_node3/assert 11728 1726882219.19344: worker is 1 (out of 1 available) 11728 1726882219.19359: exiting _queue_task() for managed_node3/assert 11728 1726882219.19371: done queuing things up, now waiting for results queue to drain 11728 1726882219.19373: waiting for pending results... 
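The assert handler that runs next checks interface_stat.stat.exists; its evaluation to True and the "All assertions passed" result appear just below. Based on the task path assert_device_present.yml:3/:5 and that conditional, the task file is presumably shaped like the following sketch (the msg text is an assumption):

```yaml
# Sketch of tasks/assert_device_present.yml as suggested by this log:
# include the stat helper, then assert on its registered result.
- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml

- name: Assert that the interface is present - '{{ interface }}'
  assert:
    that:
      - interface_stat.stat.exists
    msg: "Interface {{ interface }} is not present"   # message text is an assumption
```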
11728 1726882219.19807: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test1' 11728 1726882219.20151: in run() - task 12673a56-9f93-5c28-a762-00000000097f 11728 1726882219.20170: variable 'ansible_search_path' from source: unknown 11728 1726882219.20177: variable 'ansible_search_path' from source: unknown 11728 1726882219.20219: calling self._execute() 11728 1726882219.20569: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882219.20573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882219.20578: variable 'omit' from source: magic vars 11728 1726882219.21401: variable 'ansible_distribution_major_version' from source: facts 11728 1726882219.21405: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882219.21407: variable 'omit' from source: magic vars 11728 1726882219.21410: variable 'omit' from source: magic vars 11728 1726882219.21679: variable 'interface' from source: task vars 11728 1726882219.21898: variable 'dhcp_interface1' from source: play vars 11728 1726882219.21901: variable 'dhcp_interface1' from source: play vars 11728 1726882219.21904: variable 'omit' from source: magic vars 11728 1726882219.21906: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882219.21908: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882219.21910: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882219.21912: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882219.22101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882219.22133: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882219.22142: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882219.22149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882219.22242: Set connection var ansible_connection to ssh 11728 1726882219.22598: Set connection var ansible_shell_executable to /bin/sh 11728 1726882219.22601: Set connection var ansible_timeout to 10 11728 1726882219.22603: Set connection var ansible_shell_type to sh 11728 1726882219.22605: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882219.22608: Set connection var ansible_pipelining to False 11728 1726882219.22610: variable 'ansible_shell_executable' from source: unknown 11728 1726882219.22612: variable 'ansible_connection' from source: unknown 11728 1726882219.22615: variable 'ansible_module_compression' from source: unknown 11728 1726882219.22618: variable 'ansible_shell_type' from source: unknown 11728 1726882219.22620: variable 'ansible_shell_executable' from source: unknown 11728 1726882219.22623: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882219.22625: variable 'ansible_pipelining' from source: unknown 11728 1726882219.22628: variable 'ansible_timeout' from source: unknown 11728 1726882219.22630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882219.22741: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882219.23098: variable 'omit' from source: magic vars 11728 1726882219.23101: starting attempt loop 11728 1726882219.23104: running the handler 11728 1726882219.23154: variable 'interface_stat' from source: set_fact 11728 1726882219.23177: Evaluated conditional (interface_stat.stat.exists): True 11728 1726882219.23188: handler run complete 11728 1726882219.23207: attempt loop complete, returning result 11728 1726882219.23214: _execute() done 11728 1726882219.23221: dumping result to json 11728 1726882219.23228: done dumping result, returning 11728 1726882219.23238: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test1' [12673a56-9f93-5c28-a762-00000000097f] 11728 1726882219.23247: sending task result for task 12673a56-9f93-5c28-a762-00000000097f ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11728 1726882219.23447: no more pending results, returning what we have 11728 1726882219.23450: results queue empty 11728 1726882219.23451: checking for any_errors_fatal 11728 1726882219.23461: done checking for any_errors_fatal 11728 1726882219.23461: checking for max_fail_percentage 11728 1726882219.23463: done checking for max_fail_percentage 11728 1726882219.23464: checking to see if all hosts have failed and the running result is not ok 11728 1726882219.23465: done checking to see if all hosts have failed 11728 1726882219.23466: getting the remaining hosts for this loop 11728 1726882219.23467: done getting the remaining hosts for this loop 11728 1726882219.23471: getting the next task for host managed_node3 11728 1726882219.23481: done getting next task for host managed_node3 11728 1726882219.23484: ^ task is: TASK: Include the task 'get_interface_stat.yml' 11728 1726882219.23491: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882219.23500: getting variables 11728 1726882219.23502: in VariableManager get_vars() 11728 1726882219.23541: Calling all_inventory to load vars for managed_node3 11728 1726882219.23544: Calling groups_inventory to load vars for managed_node3 11728 1726882219.23546: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882219.23556: Calling all_plugins_play to load vars for managed_node3 11728 1726882219.23558: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882219.23561: Calling groups_plugins_play to load vars for managed_node3 11728 1726882219.24801: done sending task result for task 12673a56-9f93-5c28-a762-00000000097f 11728 1726882219.24806: WORKER PROCESS EXITING 11728 1726882219.26822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882219.29703: done with get_vars() 11728 1726882219.29727: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:30:19 -0400 (0:00:00.115) 0:00:44.152 ****** 11728 1726882219.30033: entering _queue_task() for managed_node3/include_tasks 11728 1726882219.30568: worker is 1 (out of 1 available) 11728 1726882219.30581: exiting _queue_task() for managed_node3/include_tasks 11728 1726882219.30796: done queuing things up, now waiting for results queue to drain 11728 1726882219.30798: waiting for pending results... 11728 1726882219.31615: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 11728 1726882219.31622: in run() - task 12673a56-9f93-5c28-a762-000000000983 11728 1726882219.32065: variable 'ansible_search_path' from source: unknown 11728 1726882219.32069: variable 'ansible_search_path' from source: unknown 11728 1726882219.32072: calling self._execute() 11728 1726882219.32232: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882219.32291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882219.32311: variable 'omit' from source: magic vars 11728 1726882219.33889: variable 'ansible_distribution_major_version' from source: facts 11728 1726882219.33897: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882219.33900: _execute() done 11728 1726882219.33902: dumping result to json 11728 1726882219.33905: done dumping result, returning 11728 1726882219.33907: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-5c28-a762-000000000983] 11728 1726882219.33910: sending task result for task 12673a56-9f93-5c28-a762-000000000983 11728 1726882219.33980: done sending task result for task 12673a56-9f93-5c28-a762-000000000983 11728 1726882219.33983: WORKER PROCESS EXITING 11728 1726882219.34223: no more pending results, returning what we have 11728 1726882219.34228: in VariableManager get_vars() 11728 1726882219.34276: Calling all_inventory to load vars for managed_node3 11728 1726882219.34279: Calling groups_inventory to load vars for managed_node3 11728 1726882219.34281: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882219.34297: Calling all_plugins_play to load vars for managed_node3 11728 1726882219.34301: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882219.34304: Calling 
groups_plugins_play to load vars for managed_node3 11728 1726882219.36807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882219.39746: done with get_vars() 11728 1726882219.39769: variable 'ansible_search_path' from source: unknown 11728 1726882219.39770: variable 'ansible_search_path' from source: unknown 11728 1726882219.40014: we have included files to process 11728 1726882219.40015: generating all_blocks data 11728 1726882219.40017: done generating all_blocks data 11728 1726882219.40021: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11728 1726882219.40023: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11728 1726882219.40025: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11728 1726882219.40410: done processing included file 11728 1726882219.40412: iterating over new_blocks loaded from include file 11728 1726882219.40414: in VariableManager get_vars() 11728 1726882219.40436: done with get_vars() 11728 1726882219.40438: filtering new block on tags 11728 1726882219.40467: done filtering new block on tags 11728 1726882219.40469: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 11728 1726882219.40474: extending task lists for all hosts with included blocks 11728 1726882219.40889: done extending task lists 11728 1726882219.40891: done processing included files 11728 1726882219.40892: results queue empty 11728 1726882219.40892: checking for any_errors_fatal 11728 1726882219.40898: done checking for any_errors_fatal 11728 1726882219.40898: checking for max_fail_percentage 11728 1726882219.40899: done checking for max_fail_percentage 11728 1726882219.40900: checking to see if all hosts have failed and the running result is not ok 11728 1726882219.40901: done checking to see if all hosts have failed 11728 1726882219.40902: getting the remaining hosts for this loop 11728 1726882219.40903: done getting the remaining hosts for this loop 11728 1726882219.40905: getting the next task for host managed_node3 11728 1726882219.40910: done getting next task for host managed_node3 11728 1726882219.40912: ^ task is: TASK: Get stat for interface {{ interface }} 11728 1726882219.40916: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882219.40918: getting variables 11728 1726882219.40919: in VariableManager get_vars() 11728 1726882219.40932: Calling all_inventory to load vars for managed_node3 11728 1726882219.40934: Calling groups_inventory to load vars for managed_node3 11728 1726882219.40936: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882219.40941: Calling all_plugins_play to load vars for managed_node3 11728 1726882219.40944: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882219.40947: Calling groups_plugins_play to load vars for managed_node3 11728 1726882219.43616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882219.46569: done with get_vars() 11728 1726882219.46802: done getting variables 11728 1726882219.47067: variable 'interface' from source: task vars 11728 1726882219.47071: variable 'dhcp_interface2' from source: play vars 11728 1726882219.47234: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:30:19 -0400 (0:00:00.172) 0:00:44.325 ****** 11728 1726882219.47270: entering _queue_task() for managed_node3/stat 11728 1726882219.48024: worker is 1 (out of 1 available) 11728 1726882219.48037: exiting _queue_task() for managed_node3/stat 11728 1726882219.48050: done queuing things up, now waiting for results queue to drain 11728 1726882219.48051: waiting for pending results... 
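The variable trace here (interface <- dhcp_interface2 <- play vars) shows how the templated task name resolves to 'test2'. The calling play presumably passes each DHCP test interface into the shared task file, roughly as sketched below; the task name and file layout of the caller are hypothetical, since the outer play is not shown in this log.

```yaml
# Hypothetical invocation, assuming dhcp_interface1/dhcp_interface2 are play
# vars (resolving to test1/test2) as the variable trace in this log suggests.
- name: Verify DHCP test interface is present   # task name is assumed
  include_tasks: tasks/assert_device_present.yml
  vars:
    interface: "{{ dhcp_interface2 }}"
```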
11728 1726882219.48950: running TaskExecutor() for managed_node3/TASK: Get stat for interface test2 11728 1726882219.49596: in run() - task 12673a56-9f93-5c28-a762-000000000a01 11728 1726882219.49600: variable 'ansible_search_path' from source: unknown 11728 1726882219.49604: variable 'ansible_search_path' from source: unknown 11728 1726882219.49608: calling self._execute() 11728 1726882219.49688: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882219.50119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882219.50124: variable 'omit' from source: magic vars 11728 1726882219.50872: variable 'ansible_distribution_major_version' from source: facts 11728 1726882219.51503: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882219.51506: variable 'omit' from source: magic vars 11728 1726882219.51509: variable 'omit' from source: magic vars 11728 1726882219.51511: variable 'interface' from source: task vars 11728 1726882219.51513: variable 'dhcp_interface2' from source: play vars 11728 1726882219.51770: variable 'dhcp_interface2' from source: play vars 11728 1726882219.51924: variable 'omit' from source: magic vars 11728 1726882219.51974: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882219.52190: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882219.52287: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882219.52590: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882219.52598: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882219.52601: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882219.52603: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882219.52605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882219.52672: Set connection var ansible_connection to ssh 11728 1726882219.53027: Set connection var ansible_shell_executable to /bin/sh 11728 1726882219.53030: Set connection var ansible_timeout to 10 11728 1726882219.53033: Set connection var ansible_shell_type to sh 11728 1726882219.53035: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882219.53037: Set connection var ansible_pipelining to False 11728 1726882219.53039: variable 'ansible_shell_executable' from source: unknown 11728 1726882219.53041: variable 'ansible_connection' from source: unknown 11728 1726882219.53044: variable 'ansible_module_compression' from source: unknown 11728 1726882219.53046: variable 'ansible_shell_type' from source: unknown 11728 1726882219.53048: variable 'ansible_shell_executable' from source: unknown 11728 1726882219.53049: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882219.53051: variable 'ansible_pipelining' from source: unknown 11728 1726882219.53054: variable 'ansible_timeout' from source: unknown 11728 1726882219.53056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882219.53709: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11728 1726882219.53911: variable 'omit' from source: magic vars 11728 1726882219.53924: starting attempt loop 11728 1726882219.53932: running the handler 11728 1726882219.53952: _low_level_execute_command(): starting 11728 1726882219.54013: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882219.55619: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882219.56011: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882219.57600: stdout chunk (state=3): >>>/root <<< 11728 1726882219.57689: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882219.58023: stderr chunk (state=3): >>><<< 11728 1726882219.58026: stdout chunk (state=3): >>><<< 11728 1726882219.58140: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882219.58144: _low_level_execute_command(): starting 11728 1726882219.58147: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882219.580491-13961-227942549209247 `" && echo 
ansible-tmp-1726882219.580491-13961-227942549209247="` echo /root/.ansible/tmp/ansible-tmp-1726882219.580491-13961-227942549209247 `" ) && sleep 0' 11728 1726882219.59410: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882219.59513: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882219.59589: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882219.61478: stdout chunk (state=3): >>>ansible-tmp-1726882219.580491-13961-227942549209247=/root/.ansible/tmp/ansible-tmp-1726882219.580491-13961-227942549209247 <<< 11728 1726882219.61581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882219.61617: stderr chunk (state=3): >>><<< 11728 1726882219.61879: stdout chunk (state=3): >>><<< 11728 1726882219.61882: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882219.580491-13961-227942549209247=/root/.ansible/tmp/ansible-tmp-1726882219.580491-13961-227942549209247 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882219.61884: variable 'ansible_module_compression' from source: unknown 11728 1726882219.62201: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11728 1726882219.62204: variable 'ansible_facts' from source: unknown 11728 1726882219.62273: transferring module to 
remote /root/.ansible/tmp/ansible-tmp-1726882219.580491-13961-227942549209247/AnsiballZ_stat.py 11728 1726882219.62825: Sending initial data 11728 1726882219.62828: Sent initial data (152 bytes) 11728 1726882219.64213: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882219.64343: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882219.64368: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882219.64385: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882219.64756: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882219.66439: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882219.66514: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882219.66552: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpbvd71qan /root/.ansible/tmp/ansible-tmp-1726882219.580491-13961-227942549209247/AnsiballZ_stat.py <<< 11728 1726882219.66576: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882219.580491-13961-227942549209247/AnsiballZ_stat.py" <<< 11728 1726882219.66624: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpbvd71qan" to remote "/root/.ansible/tmp/ansible-tmp-1726882219.580491-13961-227942549209247/AnsiballZ_stat.py" <<< 11728 1726882219.66743: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882219.580491-13961-227942549209247/AnsiballZ_stat.py" <<< 11728 1726882219.68002: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882219.68005: stdout chunk (state=3): >>><<< 11728 1726882219.68007: stderr chunk (state=3): >>><<< 11728 1726882219.68065: done transferring module to remote 11728 1726882219.68203: _low_level_execute_command(): starting 11728 1726882219.68206: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882219.580491-13961-227942549209247/ /root/.ansible/tmp/ansible-tmp-1726882219.580491-13961-227942549209247/AnsiballZ_stat.py && sleep 0' 11728 1726882219.69819: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882219.70236: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882219.70335: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882219.72119: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882219.72143: stdout chunk (state=3): >>><<< 11728 1726882219.72146: stderr chunk (state=3): >>><<< 11728 1726882219.72229: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882219.72239: _low_level_execute_command(): starting 11728 1726882219.72242: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882219.580491-13961-227942549209247/AnsiballZ_stat.py && sleep 0' 11728 1726882219.73406: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882219.73418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882219.73764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882219.74082: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882219.74120: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882219.74207: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882219.89153: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 29120, "dev": 23, "nlink": 1, "atime": 1726882217.1880872, "mtime": 1726882217.1880872, "ctime": 1726882217.1880872, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11728 1726882219.90517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 11728 1726882219.90527: stdout chunk (state=3): >>><<< 11728 1726882219.90538: stderr chunk (state=3): >>><<< 11728 1726882219.90562: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 29120, "dev": 23, "nlink": 1, "atime": 1726882217.1880872, "mtime": 1726882217.1880872, "ctime": 1726882217.1880872, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
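Both interface checks return essentially the same result: /sys/class/net/<name> exists and is a symlink to the corresponding virtual device. For reference, an equivalent standalone check over both test interfaces could be written as follows (a minimal sketch assuming the same managed_node3 host and the interface names test1/test2; it is not part of the test suite being run here):

```yaml
# Minimal standalone equivalent of the two checks traced in this log.
- hosts: managed_node3
  gather_facts: false
  tasks:
    - name: Stat both DHCP test interfaces
      stat:
        get_attributes: false
        get_checksum: false
        get_mime: false
        path: /sys/class/net/{{ item }}
      loop: [test1, test2]
      register: iface_stats

    - name: Assert both interfaces are present
      assert:
        that: item.stat.exists
      loop: "{{ iface_stats.results }}"
      loop_control:
        label: "{{ item.item }}"
```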
11728 1726882219.90664: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882219.580491-13961-227942549209247/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882219.90734: _low_level_execute_command(): starting 11728 1726882219.90745: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882219.580491-13961-227942549209247/ > /dev/null 2>&1 && sleep 0' 11728 1726882219.92053: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882219.92197: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 11728 1726882219.92282: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882219.92437: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882219.92490: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882219.94331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882219.94341: stdout chunk (state=3): >>><<< 11728 1726882219.94358: stderr chunk (state=3): >>><<< 11728 1726882219.94602: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 
originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882219.94609: handler run complete 11728 1726882219.94612: attempt loop complete, returning result 11728 1726882219.94614: _execute() done 11728 1726882219.94616: dumping result to json 11728 1726882219.94618: done dumping result, returning 11728 1726882219.94620: done running TaskExecutor() for managed_node3/TASK: Get stat for interface test2 [12673a56-9f93-5c28-a762-000000000a01] 11728 1726882219.94622: sending task result for task 12673a56-9f93-5c28-a762-000000000a01 ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726882217.1880872, "block_size": 4096, "blocks": 0, "ctime": 1726882217.1880872, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 29120, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1726882217.1880872, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11728 1726882219.94977: no more pending results, returning what we have 11728 1726882219.94981: results queue empty 11728 1726882219.94983: checking for any_errors_fatal 11728 1726882219.94984: done checking for any_errors_fatal 11728 1726882219.94985: checking for max_fail_percentage 11728 1726882219.94987: done checking for max_fail_percentage 11728 1726882219.94988: checking to see if all hosts have failed and the running result is not ok 11728 1726882219.94988: done checking to see if all hosts have failed 11728 1726882219.94989: getting the remaining hosts for this loop 11728 1726882219.94991: done getting the remaining hosts for this loop 11728 1726882219.94997: getting the next task for host managed_node3 11728 1726882219.95008: done getting next task for host managed_node3 11728 1726882219.95011: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11728 1726882219.95016: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882219.95021: getting variables 11728 1726882219.95022: in VariableManager get_vars() 11728 1726882219.95063: Calling all_inventory to load vars for managed_node3 11728 1726882219.95066: Calling groups_inventory to load vars for managed_node3 11728 1726882219.95069: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882219.95081: Calling all_plugins_play to load vars for managed_node3 11728 1726882219.95084: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882219.95087: Calling groups_plugins_play to load vars for managed_node3 11728 1726882219.96129: done sending task result for task 12673a56-9f93-5c28-a762-000000000a01 11728 1726882219.96132: WORKER PROCESS EXITING 11728 1726882219.98338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882220.02023: done with get_vars() 11728 1726882220.02047: done getting variables 11728 1726882220.02214: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11728 1726882220.02335: variable 'interface' from source: task vars 11728 1726882220.02453: variable 'dhcp_interface2' from source: play vars 11728 1726882220.02578: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:30:20 -0400 (0:00:00.553) 0:00:44.878 ****** 11728 1726882220.02628: entering _queue_task() for managed_node3/assert 11728 1726882220.03369: worker is 1 (out of 1 available) 11728 1726882220.03382: exiting _queue_task() for managed_node3/assert 11728 1726882220.03396: done queuing things up, now waiting for results queue to drain 11728 1726882220.03398: waiting for pending results... 
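The two results above (the stat task on /sys/class/net/test2 followed by the queued assert) correspond to the usual stat-then-assert pattern for verifying that a kernel network interface exists. A minimal sketch of that pattern follows; the module options are copied from the arguments logged above, while the use of register is illustrative (the log reports interface_stat coming "from source: set_fact", so the real assert_device_present.yml may expose the result differently).

- name: Get stat for interface {{ interface }}
  ansible.builtin.stat:
    path: "/sys/class/net/{{ interface }}"   # sysfs entry exists only while the device is present
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat

- name: Assert that the interface is present - '{{ interface }}'
  ansible.builtin.assert:
    that:
      - interface_stat.stat.exists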
11728 1726882220.04199: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test2' 11728 1726882220.04670: in run() - task 12673a56-9f93-5c28-a762-000000000984 11728 1726882220.04684: variable 'ansible_search_path' from source: unknown 11728 1726882220.04688: variable 'ansible_search_path' from source: unknown 11728 1726882220.04954: calling self._execute() 11728 1726882220.05315: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882220.05318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882220.05331: variable 'omit' from source: magic vars 11728 1726882220.06729: variable 'ansible_distribution_major_version' from source: facts 11728 1726882220.06753: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882220.06765: variable 'omit' from source: magic vars 11728 1726882220.06908: variable 'omit' from source: magic vars 11728 1726882220.07101: variable 'interface' from source: task vars 11728 1726882220.07113: variable 'dhcp_interface2' from source: play vars 11728 1726882220.07185: variable 'dhcp_interface2' from source: play vars 11728 1726882220.07214: variable 'omit' from source: magic vars 11728 1726882220.07264: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882220.07310: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882220.07334: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882220.07360: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882220.07380: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882220.07421: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882220.07430: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882220.07438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882220.07548: Set connection var ansible_connection to ssh 11728 1726882220.07565: Set connection var ansible_shell_executable to /bin/sh 11728 1726882220.07582: Set connection var ansible_timeout to 10 11728 1726882220.07589: Set connection var ansible_shell_type to sh 11728 1726882220.07687: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882220.07696: Set connection var ansible_pipelining to False 11728 1726882220.07699: variable 'ansible_shell_executable' from source: unknown 11728 1726882220.07701: variable 'ansible_connection' from source: unknown 11728 1726882220.07702: variable 'ansible_module_compression' from source: unknown 11728 1726882220.07704: variable 'ansible_shell_type' from source: unknown 11728 1726882220.07706: variable 'ansible_shell_executable' from source: unknown 11728 1726882220.07708: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882220.07710: variable 'ansible_pipelining' from source: unknown 11728 1726882220.07712: variable 'ansible_timeout' from source: unknown 11728 1726882220.07714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882220.07833: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882220.07852: variable 'omit' from source: magic vars 11728 1726882220.07863: starting attempt loop 11728 1726882220.07871: running the handler 11728 1726882220.08022: variable 'interface_stat' from source: set_fact 11728 1726882220.08049: Evaluated conditional (interface_stat.stat.exists): True 11728 1726882220.08060: handler run complete 11728 1726882220.08078: attempt loop complete, returning result 11728 1726882220.08084: _execute() done 11728 1726882220.08091: dumping result to json 11728 1726882220.08102: done dumping result, returning 11728 1726882220.08127: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test2' [12673a56-9f93-5c28-a762-000000000984] 11728 1726882220.08129: sending task result for task 12673a56-9f93-5c28-a762-000000000984 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11728 1726882220.08280: no more pending results, returning what we have 11728 1726882220.08285: results queue empty 11728 1726882220.08286: checking for any_errors_fatal 11728 1726882220.08300: done checking for any_errors_fatal 11728 1726882220.08301: checking for max_fail_percentage 11728 1726882220.08303: done checking for max_fail_percentage 11728 1726882220.08304: checking to see if all hosts have failed and the running result is not ok 11728 1726882220.08305: done checking to see if all hosts have failed 11728 1726882220.08306: getting the remaining hosts for this loop 11728 1726882220.08308: done getting the remaining hosts for this loop 11728 1726882220.08311: getting the next task for host managed_node3 11728 1726882220.08321: done getting next task for host managed_node3 11728 1726882220.08324: ^ task is: TASK: Test 11728 1726882220.08327: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882220.08334: getting variables 11728 1726882220.08336: in VariableManager get_vars() 11728 1726882220.08378: Calling all_inventory to load vars for managed_node3 11728 1726882220.08381: Calling groups_inventory to load vars for managed_node3 11728 1726882220.08384: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882220.08502: Calling all_plugins_play to load vars for managed_node3 11728 1726882220.08507: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882220.08512: Calling groups_plugins_play to load vars for managed_node3 11728 1726882220.09131: done sending task result for task 12673a56-9f93-5c28-a762-000000000984 11728 1726882220.09135: WORKER PROCESS EXITING 11728 1726882220.10473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882220.13632: done with get_vars() 11728 1726882220.13657: done getting variables TASK [Test] ******************************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:30 Friday 20 September 2024 21:30:20 -0400 (0:00:00.113) 0:00:44.992 ****** 11728 1726882220.13964: entering _queue_task() for managed_node3/include_tasks 11728 1726882220.14929: worker is 1 (out of 1 available) 11728 1726882220.14940: exiting _queue_task() for managed_node3/include_tasks 11728 1726882220.14950: done queuing things up, now waiting for results queue to drain 11728 1726882220.14952: waiting for pending results... 11728 1726882220.15174: running TaskExecutor() for managed_node3/TASK: Test 11728 1726882220.15395: in run() - task 12673a56-9f93-5c28-a762-0000000008ee 11728 1726882220.15609: variable 'ansible_search_path' from source: unknown 11728 1726882220.15613: variable 'ansible_search_path' from source: unknown 11728 1726882220.15616: variable 'lsr_test' from source: include params 11728 1726882220.16000: variable 'lsr_test' from source: include params 11728 1726882220.16129: variable 'omit' from source: magic vars 11728 1726882220.16608: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882220.16612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882220.16615: variable 'omit' from source: magic vars 11728 1726882220.16918: variable 'ansible_distribution_major_version' from source: facts 11728 1726882220.17010: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882220.17022: variable 'item' from source: unknown 11728 1726882220.17095: variable 'item' from source: unknown 11728 1726882220.17183: variable 'item' from source: unknown 11728 1726882220.17320: variable 'item' from source: unknown 11728 1726882220.17627: dumping result to json 11728 1726882220.17630: done dumping result, returning 11728 1726882220.17632: done running TaskExecutor() for managed_node3/TASK: Test [12673a56-9f93-5c28-a762-0000000008ee] 11728 1726882220.17635: sending task result for task 12673a56-9f93-5c28-a762-0000000008ee 11728 1726882220.17677: done sending task result for task 12673a56-9f93-5c28-a762-0000000008ee 11728 1726882220.17680: WORKER PROCESS EXITING 11728 1726882220.17754: no more pending results, returning what we have 11728 1726882220.17761: in VariableManager get_vars() 11728 1726882220.17812: Calling all_inventory to load vars for managed_node3 11728 1726882220.17815: Calling groups_inventory to load vars for managed_node3 11728 1726882220.17818: 
Calling all_plugins_inventory to load vars for managed_node3 11728 1726882220.17833: Calling all_plugins_play to load vars for managed_node3 11728 1726882220.17836: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882220.17840: Calling groups_plugins_play to load vars for managed_node3 11728 1726882220.21068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882220.23964: done with get_vars() 11728 1726882220.23990: variable 'ansible_search_path' from source: unknown 11728 1726882220.23991: variable 'ansible_search_path' from source: unknown 11728 1726882220.24236: we have included files to process 11728 1726882220.24237: generating all_blocks data 11728 1726882220.24239: done generating all_blocks data 11728 1726882220.24244: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml 11728 1726882220.24245: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml 11728 1726882220.24247: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml 11728 1726882220.24712: in VariableManager get_vars() 11728 1726882220.24737: done with get_vars() 11728 1726882220.24743: variable 'omit' from source: magic vars 11728 1726882220.24782: variable 'omit' from source: magic vars 11728 1726882220.25042: in VariableManager get_vars() 11728 1726882220.25057: done with get_vars() 11728 1726882220.25081: in VariableManager get_vars() 11728 1726882220.25102: done with get_vars() 11728 1726882220.25137: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 11728 1726882220.25512: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 11728 1726882220.25590: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 11728 1726882220.26371: in VariableManager get_vars() 11728 1726882220.26595: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11728 1726882220.31799: done processing included file 11728 1726882220.31802: iterating over new_blocks loaded from include file 11728 1726882220.31803: in VariableManager get_vars() 11728 1726882220.31839: done with get_vars() 11728 1726882220.31841: filtering new block on tags 11728 1726882220.32557: done filtering new block on tags 11728 1726882220.32561: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml for managed_node3 => (item=tasks/create_bond_profile_reconfigure.yml) 11728 1726882220.32567: extending task lists for all hosts with included blocks 11728 1726882220.36537: done extending task lists 11728 1726882220.36540: done processing included files 11728 1726882220.36540: results queue empty 11728 1726882220.36541: checking for any_errors_fatal 11728 1726882220.36545: done checking for any_errors_fatal 11728 1726882220.36545: checking for max_fail_percentage 11728 1726882220.36546: done checking for max_fail_percentage 11728 1726882220.36547: checking to see if all hosts have failed and the running result is not ok 11728 
1726882220.36548: done checking to see if all hosts have failed 11728 1726882220.36549: getting the remaining hosts for this loop 11728 1726882220.36551: done getting the remaining hosts for this loop 11728 1726882220.36553: getting the next task for host managed_node3 11728 1726882220.36558: done getting next task for host managed_node3 11728 1726882220.36561: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11728 1726882220.36565: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882220.36577: getting variables 11728 1726882220.36578: in VariableManager get_vars() 11728 1726882220.36600: Calling all_inventory to load vars for managed_node3 11728 1726882220.36603: Calling groups_inventory to load vars for managed_node3 11728 1726882220.36605: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882220.36611: Calling all_plugins_play to load vars for managed_node3 11728 1726882220.36614: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882220.36617: Calling groups_plugins_play to load vars for managed_node3 11728 1726882220.39328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882220.42527: done with get_vars() 11728 1726882220.42559: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:30:20 -0400 (0:00:00.286) 0:00:45.278 ****** 11728 1726882220.42647: entering _queue_task() for managed_node3/include_tasks 11728 1726882220.43426: worker is 1 (out of 1 available) 11728 1726882220.43441: exiting _queue_task() for managed_node3/include_tasks 11728 1726882220.43455: done queuing things up, now waiting for results queue to drain 11728 1726882220.43456: waiting for pending results... 
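The "Test" task traced above is an include_tasks loop driven by the lsr_test include parameter: each list entry is loaded as a task file (here tasks/create_bond_profile_reconfigure.yml), gated by the same distribution check applied throughout the run. A rough sketch of what run_test.yml:30 might look like, assuming lsr_test is a list of relative file names (the actual task file may differ):

- name: Test
  ansible.builtin.include_tasks: "{{ item }}"
  loop: "{{ lsr_test }}"    # e.g. ['tasks/create_bond_profile_reconfigure.yml'] in this run
  when: ansible_distribution_major_version != '6'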
11728 1726882220.44315: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11728 1726882220.44373: in run() - task 12673a56-9f93-5c28-a762-000000000a2e 11728 1726882220.44626: variable 'ansible_search_path' from source: unknown 11728 1726882220.44630: variable 'ansible_search_path' from source: unknown 11728 1726882220.44632: calling self._execute() 11728 1726882220.44746: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882220.44758: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882220.44772: variable 'omit' from source: magic vars 11728 1726882220.45546: variable 'ansible_distribution_major_version' from source: facts 11728 1726882220.45565: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882220.45609: _execute() done 11728 1726882220.45617: dumping result to json 11728 1726882220.45626: done dumping result, returning 11728 1726882220.45799: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-5c28-a762-000000000a2e] 11728 1726882220.45803: sending task result for task 12673a56-9f93-5c28-a762-000000000a2e 11728 1726882220.45877: done sending task result for task 12673a56-9f93-5c28-a762-000000000a2e 11728 1726882220.45880: WORKER PROCESS EXITING 11728 1726882220.45934: no more pending results, returning what we have 11728 1726882220.45940: in VariableManager get_vars() 11728 1726882220.45997: Calling all_inventory to load vars for managed_node3 11728 1726882220.46000: Calling groups_inventory to load vars for managed_node3 11728 1726882220.46003: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882220.46017: Calling all_plugins_play to load vars for managed_node3 11728 1726882220.46021: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882220.46025: Calling groups_plugins_play to load vars for managed_node3 11728 1726882220.49107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882220.51858: done with get_vars() 11728 1726882220.51883: variable 'ansible_search_path' from source: unknown 11728 1726882220.51884: variable 'ansible_search_path' from source: unknown 11728 1726882220.52128: we have included files to process 11728 1726882220.52129: generating all_blocks data 11728 1726882220.52131: done generating all_blocks data 11728 1726882220.52133: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11728 1726882220.52134: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11728 1726882220.52136: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11728 1726882220.53328: done processing included file 11728 1726882220.53330: iterating over new_blocks loaded from include file 11728 1726882220.53332: in VariableManager get_vars() 11728 1726882220.53362: done with get_vars() 11728 1726882220.53364: filtering new block on tags 11728 1726882220.53398: done filtering new block on tags 11728 1726882220.53401: in VariableManager get_vars() 11728 1726882220.53429: done with get_vars() 11728 1726882220.53430: filtering new block on tags 11728 1726882220.53477: done filtering new block on tags 11728 1726882220.53480: in 
VariableManager get_vars() 11728 1726882220.53712: done with get_vars() 11728 1726882220.53714: filtering new block on tags 11728 1726882220.53757: done filtering new block on tags 11728 1726882220.53760: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 11728 1726882220.53765: extending task lists for all hosts with included blocks 11728 1726882220.57106: done extending task lists 11728 1726882220.57108: done processing included files 11728 1726882220.57109: results queue empty 11728 1726882220.57109: checking for any_errors_fatal 11728 1726882220.57113: done checking for any_errors_fatal 11728 1726882220.57114: checking for max_fail_percentage 11728 1726882220.57115: done checking for max_fail_percentage 11728 1726882220.57116: checking to see if all hosts have failed and the running result is not ok 11728 1726882220.57117: done checking to see if all hosts have failed 11728 1726882220.57118: getting the remaining hosts for this loop 11728 1726882220.57119: done getting the remaining hosts for this loop 11728 1726882220.57122: getting the next task for host managed_node3 11728 1726882220.57127: done getting next task for host managed_node3 11728 1726882220.57130: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11728 1726882220.57134: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882220.57145: getting variables 11728 1726882220.57146: in VariableManager get_vars() 11728 1726882220.57167: Calling all_inventory to load vars for managed_node3 11728 1726882220.57169: Calling groups_inventory to load vars for managed_node3 11728 1726882220.57171: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882220.57176: Calling all_plugins_play to load vars for managed_node3 11728 1726882220.57179: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882220.57181: Calling groups_plugins_play to load vars for managed_node3 11728 1726882220.58384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882220.60284: done with get_vars() 11728 1726882220.60316: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:30:20 -0400 (0:00:00.177) 0:00:45.456 ****** 11728 1726882220.60398: entering _queue_task() for managed_node3/setup 11728 1726882220.61117: worker is 1 (out of 1 available) 11728 1726882220.61126: exiting _queue_task() for managed_node3/setup 11728 1726882220.61137: done queuing things up, now waiting for results queue to drain 11728 1726882220.61138: waiting for pending results... 11728 1726882220.61567: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11728 1726882220.62001: in run() - task 12673a56-9f93-5c28-a762-000000000b10 11728 1726882220.62199: variable 'ansible_search_path' from source: unknown 11728 1726882220.62203: variable 'ansible_search_path' from source: unknown 11728 1726882220.62206: calling self._execute() 11728 1726882220.62279: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882220.62502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882220.62506: variable 'omit' from source: magic vars 11728 1726882220.63208: variable 'ansible_distribution_major_version' from source: facts 11728 1726882220.63228: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882220.63662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882220.68728: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882220.68847: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882220.69031: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882220.69071: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882220.69302: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882220.69306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882220.69345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 11728 1726882220.69436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882220.69564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882220.69584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882220.69723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882220.69777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882220.69829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882220.70172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882220.70175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882220.70444: variable '__network_required_facts' from source: role '' defaults 11728 1726882220.70462: variable 'ansible_facts' from source: unknown 11728 1726882220.71514: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 11728 1726882220.71524: when evaluation is False, skipping this task 11728 1726882220.71533: _execute() done 11728 1726882220.71544: dumping result to json 11728 1726882220.71552: done dumping result, returning 11728 1726882220.71564: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-5c28-a762-000000000b10] 11728 1726882220.71574: sending task result for task 12673a56-9f93-5c28-a762-000000000b10 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11728 1726882220.71802: no more pending results, returning what we have 11728 1726882220.71807: results queue empty 11728 1726882220.71808: checking for any_errors_fatal 11728 1726882220.71810: done checking for any_errors_fatal 11728 1726882220.71811: checking for max_fail_percentage 11728 1726882220.71813: done checking for max_fail_percentage 11728 1726882220.71814: checking to see if all hosts have failed and the running result is not ok 11728 1726882220.71815: done checking to see if all hosts have failed 11728 1726882220.71815: getting the remaining hosts for this loop 11728 1726882220.71818: done getting the remaining hosts for this loop 11728 1726882220.71821: getting the next task for host managed_node3 11728 1726882220.71833: done getting next task for host 
managed_node3 11728 1726882220.71836: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 11728 1726882220.71842: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882220.71980: getting variables 11728 1726882220.71982: in VariableManager get_vars() 11728 1726882220.72030: Calling all_inventory to load vars for managed_node3 11728 1726882220.72034: Calling groups_inventory to load vars for managed_node3 11728 1726882220.72036: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882220.72099: Calling all_plugins_play to load vars for managed_node3 11728 1726882220.72103: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882220.72107: Calling groups_plugins_play to load vars for managed_node3 11728 1726882220.72708: done sending task result for task 12673a56-9f93-5c28-a762-000000000b10 11728 1726882220.72717: WORKER PROCESS EXITING 11728 1726882220.74848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882220.77461: done with get_vars() 11728 1726882220.77483: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:30:20 -0400 (0:00:00.171) 0:00:45.628 ****** 11728 1726882220.77597: entering _queue_task() for managed_node3/stat 11728 1726882220.77926: worker is 1 (out of 1 available) 11728 1726882220.78054: exiting _queue_task() for managed_node3/stat 11728 1726882220.78065: done queuing things up, now waiting for results queue to drain 11728 1726882220.78066: waiting for pending results... 
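The skip reported above comes from a guard that only re-runs fact gathering when facts the role needs are missing: the role's defaults define a __network_required_facts list, and the setup task fires only if subtracting the already-collected fact keys leaves something over. A sketch of that guard is below; the gather_subset value is illustrative, since the setup arguments are not visible in this log.

- name: Ensure ansible_facts used by role are present
  ansible.builtin.setup:
    gather_subset:
      - min               # illustrative subset; the real task's arguments are not shown in the log
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0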
11728 1726882220.78249: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 11728 1726882220.78434: in run() - task 12673a56-9f93-5c28-a762-000000000b12 11728 1726882220.78455: variable 'ansible_search_path' from source: unknown 11728 1726882220.78464: variable 'ansible_search_path' from source: unknown 11728 1726882220.78514: calling self._execute() 11728 1726882220.78645: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882220.78656: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882220.78668: variable 'omit' from source: magic vars 11728 1726882220.79038: variable 'ansible_distribution_major_version' from source: facts 11728 1726882220.79062: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882220.79245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882220.79873: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882220.79877: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882220.79879: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882220.79882: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882220.80155: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882220.80183: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882220.80405: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882220.80409: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882220.80458: variable '__network_is_ostree' from source: set_fact 11728 1726882220.80526: Evaluated conditional (not __network_is_ostree is defined): False 11728 1726882220.80536: when evaluation is False, skipping this task 11728 1726882220.80731: _execute() done 11728 1726882220.80734: dumping result to json 11728 1726882220.80738: done dumping result, returning 11728 1726882220.80742: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-5c28-a762-000000000b12] 11728 1726882220.80744: sending task result for task 12673a56-9f93-5c28-a762-000000000b12 11728 1726882220.80824: done sending task result for task 12673a56-9f93-5c28-a762-000000000b12 11728 1726882220.80827: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11728 1726882220.80882: no more pending results, returning what we have 11728 1726882220.80887: results queue empty 11728 1726882220.80888: checking for any_errors_fatal 11728 1726882220.80900: done checking for any_errors_fatal 11728 1726882220.80901: checking for 
max_fail_percentage 11728 1726882220.80903: done checking for max_fail_percentage 11728 1726882220.80904: checking to see if all hosts have failed and the running result is not ok 11728 1726882220.80905: done checking to see if all hosts have failed 11728 1726882220.80906: getting the remaining hosts for this loop 11728 1726882220.80908: done getting the remaining hosts for this loop 11728 1726882220.80911: getting the next task for host managed_node3 11728 1726882220.80919: done getting next task for host managed_node3 11728 1726882220.80922: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11728 1726882220.80928: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882220.80958: getting variables 11728 1726882220.80959: in VariableManager get_vars() 11728 1726882220.81207: Calling all_inventory to load vars for managed_node3 11728 1726882220.81210: Calling groups_inventory to load vars for managed_node3 11728 1726882220.81213: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882220.81226: Calling all_plugins_play to load vars for managed_node3 11728 1726882220.81230: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882220.81233: Calling groups_plugins_play to load vars for managed_node3 11728 1726882220.82948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882220.85297: done with get_vars() 11728 1726882220.85334: done getting variables 11728 1726882220.85399: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:30:20 -0400 (0:00:00.078) 0:00:45.706 ****** 11728 1726882220.85453: entering _queue_task() for managed_node3/set_fact 11728 1726882220.86109: worker is 1 (out of 1 available) 11728 1726882220.86123: exiting _queue_task() for managed_node3/set_fact 11728 1726882220.86182: done queuing things up, now waiting for results queue to drain 11728 1726882220.86184: waiting for pending results... 11728 1726882220.86550: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11728 1726882220.86715: in run() - task 12673a56-9f93-5c28-a762-000000000b13 11728 1726882220.86719: variable 'ansible_search_path' from source: unknown 11728 1726882220.86722: variable 'ansible_search_path' from source: unknown 11728 1726882220.86725: calling self._execute() 11728 1726882220.86802: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882220.86808: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882220.86812: variable 'omit' from source: magic vars 11728 1726882220.87273: variable 'ansible_distribution_major_version' from source: facts 11728 1726882220.87276: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882220.87447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882220.87795: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882220.87830: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882220.87869: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882220.87926: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882220.88009: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882220.88099: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882220.88104: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882220.88107: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882220.88187: variable '__network_is_ostree' from source: set_fact 11728 1726882220.88206: Evaluated conditional (not __network_is_ostree is defined): False 11728 1726882220.88210: when evaluation is False, skipping this task 11728 1726882220.88212: _execute() done 11728 1726882220.88215: dumping result to json 11728 1726882220.88222: done dumping result, returning 11728 1726882220.88243: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-5c28-a762-000000000b13] 11728 1726882220.88249: sending task result for task 12673a56-9f93-5c28-a762-000000000b13 11728 1726882220.88469: done sending task result for task 12673a56-9f93-5c28-a762-000000000b13 11728 1726882220.88472: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11728 1726882220.88524: no more pending results, returning what we have 11728 1726882220.88527: results queue empty 11728 1726882220.88528: checking for any_errors_fatal 11728 1726882220.88532: done checking for any_errors_fatal 11728 1726882220.88532: checking for max_fail_percentage 11728 1726882220.88534: done checking for max_fail_percentage 11728 1726882220.88534: checking to see if all hosts have failed and the running result is not ok 11728 1726882220.88535: done checking to see if all hosts have failed 11728 1726882220.88536: getting the remaining hosts for this loop 11728 1726882220.88537: done getting the remaining hosts for this loop 11728 1726882220.88540: getting the next task for host managed_node3 11728 1726882220.88552: done getting next task for host managed_node3 11728 1726882220.88555: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 11728 1726882220.88560: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882220.88577: getting variables 11728 1726882220.88579: in VariableManager get_vars() 11728 1726882220.88624: Calling all_inventory to load vars for managed_node3 11728 1726882220.88627: Calling groups_inventory to load vars for managed_node3 11728 1726882220.88629: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882220.88637: Calling all_plugins_play to load vars for managed_node3 11728 1726882220.88639: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882220.88642: Calling groups_plugins_play to load vars for managed_node3 11728 1726882220.90869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882220.92584: done with get_vars() 11728 1726882220.92614: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:30:20 -0400 (0:00:00.072) 0:00:45.779 ****** 11728 1726882220.92726: entering _queue_task() for managed_node3/service_facts 11728 1726882220.93509: worker is 1 (out of 1 available) 11728 1726882220.93524: exiting _queue_task() for managed_node3/service_facts 11728 1726882220.93720: done queuing things up, now waiting for results queue to drain 11728 1726882220.93722: waiting for pending results... 
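Both ostree-related tasks above are skipped because __network_is_ostree was already set by set_fact earlier in the run, so the guard "not __network_is_ostree is defined" evaluates to False. The pattern being skipped is a detect-once-and-cache idiom; the sketch below assumes the marker file /run/ostree-booted and the register name __ostree_booted_stat, neither of which appears in this log.

- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted          # assumed marker path; not visible in the log output
  register: __ostree_booted_stat
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined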
11728 1726882220.94332: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 11728 1726882220.94644: in run() - task 12673a56-9f93-5c28-a762-000000000b15 11728 1726882220.94648: variable 'ansible_search_path' from source: unknown 11728 1726882220.94656: variable 'ansible_search_path' from source: unknown 11728 1726882220.94740: calling self._execute() 11728 1726882220.94970: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882220.94978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882220.95201: variable 'omit' from source: magic vars 11728 1726882220.96024: variable 'ansible_distribution_major_version' from source: facts 11728 1726882220.96261: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882220.96264: variable 'omit' from source: magic vars 11728 1726882220.96270: variable 'omit' from source: magic vars 11728 1726882220.96332: variable 'omit' from source: magic vars 11728 1726882220.96524: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882220.96565: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882220.96601: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882220.96804: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882220.96812: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882220.96818: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882220.96821: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882220.96823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882220.97007: Set connection var ansible_connection to ssh 11728 1726882220.97239: Set connection var ansible_shell_executable to /bin/sh 11728 1726882220.97242: Set connection var ansible_timeout to 10 11728 1726882220.97244: Set connection var ansible_shell_type to sh 11728 1726882220.97247: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882220.97249: Set connection var ansible_pipelining to False 11728 1726882220.97251: variable 'ansible_shell_executable' from source: unknown 11728 1726882220.97253: variable 'ansible_connection' from source: unknown 11728 1726882220.97255: variable 'ansible_module_compression' from source: unknown 11728 1726882220.97257: variable 'ansible_shell_type' from source: unknown 11728 1726882220.97262: variable 'ansible_shell_executable' from source: unknown 11728 1726882220.97264: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882220.97266: variable 'ansible_pipelining' from source: unknown 11728 1726882220.97269: variable 'ansible_timeout' from source: unknown 11728 1726882220.97271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882220.97744: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11728 1726882220.97799: variable 'omit' from source: magic vars 11728 
1726882220.98012: starting attempt loop 11728 1726882220.98016: running the handler 11728 1726882220.98019: _low_level_execute_command(): starting 11728 1726882220.98022: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882220.99416: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882220.99462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882220.99521: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882220.99686: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882220.99707: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882220.99791: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882221.01540: stdout chunk (state=3): >>>/root <<< 11728 1726882221.01579: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882221.01621: stderr chunk (state=3): >>><<< 11728 1726882221.01629: stdout chunk (state=3): >>><<< 11728 1726882221.01747: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882221.01750: _low_level_execute_command(): starting 11728 1726882221.01753: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882221.0171638-14015-124655502944110 `" && echo 
ansible-tmp-1726882221.0171638-14015-124655502944110="` echo /root/.ansible/tmp/ansible-tmp-1726882221.0171638-14015-124655502944110 `" ) && sleep 0' 11728 1726882221.03083: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882221.03216: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882221.03416: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882221.03475: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882221.03777: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882221.05622: stdout chunk (state=3): >>>ansible-tmp-1726882221.0171638-14015-124655502944110=/root/.ansible/tmp/ansible-tmp-1726882221.0171638-14015-124655502944110 <<< 11728 1726882221.05776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882221.05787: stdout chunk (state=3): >>><<< 11728 1726882221.05987: stderr chunk (state=3): >>><<< 11728 1726882221.05991: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882221.0171638-14015-124655502944110=/root/.ansible/tmp/ansible-tmp-1726882221.0171638-14015-124655502944110 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882221.05998: variable 'ansible_module_compression' from source: unknown 11728 1726882221.06130: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 11728 1726882221.06182: variable 'ansible_facts' from source: unknown 11728 1726882221.06592: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882221.0171638-14015-124655502944110/AnsiballZ_service_facts.py 11728 1726882221.06721: Sending initial data 11728 1726882221.06731: Sent initial data (162 bytes) 11728 1726882221.08131: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882221.08160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882221.08320: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882221.08425: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882221.08444: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882221.08466: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882221.08539: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882221.10181: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882221.10185: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882221.0171638-14015-124655502944110/AnsiballZ_service_facts.py" <<< 11728 1726882221.10187: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpjm1kti7w /root/.ansible/tmp/ansible-tmp-1726882221.0171638-14015-124655502944110/AnsiballZ_service_facts.py <<< 11728 1726882221.10215: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpjm1kti7w" to remote "/root/.ansible/tmp/ansible-tmp-1726882221.0171638-14015-124655502944110/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882221.0171638-14015-124655502944110/AnsiballZ_service_facts.py" <<< 11728 1726882221.11713: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882221.11758: stderr chunk (state=3): >>><<< 11728 1726882221.11969: stdout chunk (state=3): >>><<< 11728 1726882221.11976: done transferring module to remote 11728 1726882221.11980: _low_level_execute_command(): starting 11728 1726882221.11996: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882221.0171638-14015-124655502944110/ /root/.ansible/tmp/ansible-tmp-1726882221.0171638-14015-124655502944110/AnsiballZ_service_facts.py && sleep 0' 11728 1726882221.13392: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11728 1726882221.13584: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882221.13732: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882221.13804: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882221.15701: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882221.15705: stdout chunk (state=3): >>><<< 11728 1726882221.15707: stderr chunk (state=3): >>><<< 11728 1726882221.15713: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882221.15716: _low_level_execute_command(): starting 11728 1726882221.15718: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882221.0171638-14015-124655502944110/AnsiballZ_service_facts.py && sleep 0' 11728 1726882221.16604: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882221.16608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882221.16612: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882221.16615: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882221.16617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 11728 1726882221.16630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882221.16688: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882221.16717: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882221.16764: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882222.64986: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": 
"cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 11728 1726882222.65033: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 11728 1726882222.65056: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": 
{"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integratio<<< 11728 1726882222.65064: stdout chunk (state=3): >>>n.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": 
{"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 11728 1726882222.66609: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882222.66612: stdout chunk (state=3): >>><<< 11728 1726882222.66614: stderr chunk (state=3): >>><<< 11728 1726882222.66621: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": 
{"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": 
"systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
11728 1726882222.67311: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882221.0171638-14015-124655502944110/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882222.67320: _low_level_execute_command(): starting 11728 1726882222.67326: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882221.0171638-14015-124655502944110/ > /dev/null 2>&1 && sleep 0' 11728 1726882222.67897: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882222.67915: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882222.68000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882222.68004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882222.68006: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882222.68008: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882222.68011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882222.68015: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882222.68017: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882222.68021: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11728 1726882222.68024: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882222.68026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882222.68028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882222.68030: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882222.68032: stderr chunk (state=3): >>>debug2: match found <<< 11728 1726882222.68034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882222.68100: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882222.68112: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882222.68121: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882222.68202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882222.69981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882222.70038: stderr chunk (state=3): >>><<< 11728 1726882222.70058: stdout chunk (state=3): >>><<< 11728 1726882222.70300: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882222.70304: handler run complete 11728 1726882222.70306: variable 'ansible_facts' from source: unknown 11728 1726882222.70468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882222.71030: variable 'ansible_facts' from source: unknown 11728 1726882222.71187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882222.71500: attempt loop complete, returning result 11728 1726882222.71510: _execute() done 11728 1726882222.71522: dumping result to json 11728 1726882222.71530: done dumping result, returning 11728 1726882222.71544: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-5c28-a762-000000000b15] 11728 1726882222.71555: sending task result for task 12673a56-9f93-5c28-a762-000000000b15 11728 1726882222.72689: done sending task result for task 12673a56-9f93-5c28-a762-000000000b15 11728 1726882222.72696: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11728 1726882222.72875: no more pending results, returning what we have 11728 1726882222.72878: results queue empty 11728 1726882222.72879: checking for any_errors_fatal 11728 1726882222.72884: done checking for any_errors_fatal 11728 1726882222.72885: checking for max_fail_percentage 11728 1726882222.72886: done checking for max_fail_percentage 11728 1726882222.72887: checking to see if all hosts have failed and the running result is not ok 11728 1726882222.72888: done checking to see if all hosts have failed 11728 1726882222.72888: getting the remaining hosts for this loop 11728 1726882222.72890: done getting the remaining hosts for this loop 11728 1726882222.72897: getting the next task for host managed_node3 11728 1726882222.72904: done getting next task for host managed_node3 11728 1726882222.72908: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 11728 1726882222.72914: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882222.72941: getting variables 11728 1726882222.72943: in VariableManager get_vars() 11728 1726882222.72979: Calling all_inventory to load vars for managed_node3 11728 1726882222.72982: Calling groups_inventory to load vars for managed_node3 11728 1726882222.72984: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882222.72997: Calling all_plugins_play to load vars for managed_node3 11728 1726882222.73001: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882222.73005: Calling groups_plugins_play to load vars for managed_node3 11728 1726882222.74532: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882222.76235: done with get_vars() 11728 1726882222.76261: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:30:22 -0400 (0:00:01.836) 0:00:47.616 ****** 11728 1726882222.76373: entering _queue_task() for managed_node3/package_facts 11728 1726882222.76761: worker is 1 (out of 1 available) 11728 1726882222.76773: exiting _queue_task() for managed_node3/package_facts 11728 1726882222.76785: done queuing things up, now waiting for results queue to drain 11728 1726882222.76786: waiting for pending results... 
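The task queued here, fedora.linux_system_roles.network : Check which packages are installed (set_facts.yml:26), runs the package_facts module through the same _queue_task/_execute_module path; its result, visible further down in the stdout chunks, lands in ansible_facts.packages, a mapping from package name to a list of {name, version, release, epoch, arch, source} entries. A minimal sketch of an equivalent task plus a query against that structure, assuming a standalone playbook and using NetworkManager purely as an illustrative key:

    # Minimal sketch (assumptions: standalone tasks; NetworkManager chosen only as an example key).
    - name: Check which packages are installed
      ansible.builtin.package_facts:

    - name: Report the installed NetworkManager version, if present
      ansible.builtin.debug:
        msg: "NetworkManager {{ ansible_facts.packages['NetworkManager'][0].version }} is installed"
      when: "'NetworkManager' in ansible_facts.packages"

Because each package name maps to a list (multiple versions or architectures can coexist), the query indexes element 0; a stricter playbook might loop over the list instead.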
11728 1726882222.76979: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 11728 1726882222.77084: in run() - task 12673a56-9f93-5c28-a762-000000000b16 11728 1726882222.77101: variable 'ansible_search_path' from source: unknown 11728 1726882222.77105: variable 'ansible_search_path' from source: unknown 11728 1726882222.77137: calling self._execute() 11728 1726882222.77218: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882222.77224: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882222.77233: variable 'omit' from source: magic vars 11728 1726882222.77517: variable 'ansible_distribution_major_version' from source: facts 11728 1726882222.77524: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882222.77530: variable 'omit' from source: magic vars 11728 1726882222.77590: variable 'omit' from source: magic vars 11728 1726882222.77623: variable 'omit' from source: magic vars 11728 1726882222.77650: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882222.77679: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882222.77697: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882222.77713: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882222.77728: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882222.77751: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882222.77754: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882222.77757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882222.77825: Set connection var ansible_connection to ssh 11728 1726882222.77836: Set connection var ansible_shell_executable to /bin/sh 11728 1726882222.77841: Set connection var ansible_timeout to 10 11728 1726882222.77843: Set connection var ansible_shell_type to sh 11728 1726882222.77851: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882222.77853: Set connection var ansible_pipelining to False 11728 1726882222.77870: variable 'ansible_shell_executable' from source: unknown 11728 1726882222.77874: variable 'ansible_connection' from source: unknown 11728 1726882222.77878: variable 'ansible_module_compression' from source: unknown 11728 1726882222.77880: variable 'ansible_shell_type' from source: unknown 11728 1726882222.77883: variable 'ansible_shell_executable' from source: unknown 11728 1726882222.77885: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882222.77887: variable 'ansible_pipelining' from source: unknown 11728 1726882222.77890: variable 'ansible_timeout' from source: unknown 11728 1726882222.77895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882222.78039: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11728 1726882222.78049: variable 'omit' from source: magic vars 11728 
1726882222.78052: starting attempt loop 11728 1726882222.78055: running the handler 11728 1726882222.78069: _low_level_execute_command(): starting 11728 1726882222.78076: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882222.78585: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882222.78588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882222.78592: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882222.78597: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882222.78648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882222.78655: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882222.78657: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882222.78705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882222.80602: stdout chunk (state=3): >>>/root <<< 11728 1726882222.80606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882222.80609: stdout chunk (state=3): >>><<< 11728 1726882222.80611: stderr chunk (state=3): >>><<< 11728 1726882222.80615: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882222.80617: _low_level_execute_command(): starting 11728 1726882222.80620: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882222.8053837-14078-38612544112870 `" && echo ansible-tmp-1726882222.8053837-14078-38612544112870="` echo /root/.ansible/tmp/ansible-tmp-1726882222.8053837-14078-38612544112870 `" ) && sleep 0' 11728 1726882222.81530: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882222.81546: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882222.81561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882222.81578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882222.81600: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882222.81622: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882222.81637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882222.81656: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882222.81712: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882222.81765: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882222.81784: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882222.81802: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882222.81878: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882222.83729: stdout chunk (state=3): >>>ansible-tmp-1726882222.8053837-14078-38612544112870=/root/.ansible/tmp/ansible-tmp-1726882222.8053837-14078-38612544112870 <<< 11728 1726882222.83873: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882222.83888: stdout chunk (state=3): >>><<< 11728 1726882222.83904: stderr chunk (state=3): >>><<< 11728 1726882222.83923: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882222.8053837-14078-38612544112870=/root/.ansible/tmp/ansible-tmp-1726882222.8053837-14078-38612544112870 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882222.83985: variable 'ansible_module_compression' from source: unknown 11728 1726882222.84046: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 11728 1726882222.84119: variable 'ansible_facts' from source: unknown 11728 1726882222.84412: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882222.8053837-14078-38612544112870/AnsiballZ_package_facts.py 11728 1726882222.84517: Sending initial data 11728 1726882222.84521: Sent initial data (161 bytes) 11728 1726882222.85070: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882222.85080: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882222.85102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882222.85109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882222.85115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882222.85178: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882222.85211: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882222.85254: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882222.86803: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882222.86842: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882222.86885: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmp_idvj3fq /root/.ansible/tmp/ansible-tmp-1726882222.8053837-14078-38612544112870/AnsiballZ_package_facts.py <<< 11728 1726882222.86891: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882222.8053837-14078-38612544112870/AnsiballZ_package_facts.py" <<< 11728 1726882222.86932: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmp_idvj3fq" to remote "/root/.ansible/tmp/ansible-tmp-1726882222.8053837-14078-38612544112870/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882222.8053837-14078-38612544112870/AnsiballZ_package_facts.py" <<< 11728 1726882222.87986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882222.88027: stderr chunk (state=3): >>><<< 11728 1726882222.88030: stdout chunk (state=3): >>><<< 11728 1726882222.88045: done transferring module to remote 11728 1726882222.88054: _low_level_execute_command(): starting 11728 1726882222.88058: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882222.8053837-14078-38612544112870/ /root/.ansible/tmp/ansible-tmp-1726882222.8053837-14078-38612544112870/AnsiballZ_package_facts.py && sleep 0' 11728 1726882222.88603: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882222.88607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882222.88609: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882222.88616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882222.88678: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882222.88684: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882222.88712: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882222.88753: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882222.90485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882222.90525: stderr chunk (state=3): >>><<< 11728 1726882222.90528: stdout chunk (state=3): >>><<< 11728 1726882222.90556: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882222.90559: _low_level_execute_command(): starting 11728 1726882222.90562: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882222.8053837-14078-38612544112870/AnsiballZ_package_facts.py && sleep 0' 11728 1726882222.91130: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882222.91151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882222.91154: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882222.91157: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882222.91159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882222.91206: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882222.91245: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882222.91313: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882223.35630: stdout chunk (state=3): >>> <<< 11728 1726882223.35718: stdout chunk (state=3): >>>{"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": <<< 11728 1726882223.35730: stdout chunk (state=3): >>>[{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": 
"6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64"<<< 11728 1726882223.35863: stdout chunk (state=3): >>>, "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "py<<< 11728 1726882223.35876: stdout chunk (state=3): >>>thon3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", <<< 11728 1726882223.35882: stdout chunk (state=3): >>>"epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": <<< 11728 1726882223.35885: stdout chunk (state=3): >>>"libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], <<< 11728 1726882223.35963: stdout chunk (state=3): >>>"initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "t<<< 11728 1726882223.35968: stdout chunk (state=3): >>>pm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "s<<< 11728 1726882223.35971: stdout chunk (state=3): >>>udo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", 
"version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "versio<<< 11728 1726882223.35976: stdout chunk (state=3): >>>n": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": 
"e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "releas<<< 11728 1726882223.36184: stdout chunk (state=3): >>>e": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": <<< 11728 1726882223.36194: stdout chunk (state=3): >>>null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.24<<< 11728 1726882223.36233: stdout chunk (state=3): >>>1", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 
1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": 
"18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 11728 1726882223.38295: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 11728 1726882223.38298: stdout chunk (state=3): >>><<< 11728 1726882223.38301: stderr chunk (state=3): >>><<< 11728 1726882223.38372: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 11728 1726882223.55141: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882222.8053837-14078-38612544112870/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882223.55145: _low_level_execute_command(): starting 11728 1726882223.55148: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882222.8053837-14078-38612544112870/ > /dev/null 2>&1 && sleep 0' 11728 1726882223.55764: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882223.55802: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882223.55817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882223.55912: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882223.55937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882223.55961: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882223.55980: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882223.56128: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882223.58098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882223.58105: stdout chunk (state=3): >>><<< 11728 1726882223.58113: stderr chunk (state=3): >>><<< 11728 1726882223.58130: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882223.58299: handler run complete 11728 1726882223.59014: variable 'ansible_facts' from source: unknown 11728 1726882223.59484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882223.61117: variable 'ansible_facts' from source: unknown 11728 1726882223.61602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882223.62277: attempt loop complete, returning result 11728 1726882223.62297: _execute() done 11728 1726882223.62305: dumping result to json 11728 1726882223.62512: done dumping result, returning 11728 1726882223.62520: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-5c28-a762-000000000b16] 11728 1726882223.62523: sending task result for task 12673a56-9f93-5c28-a762-000000000b16 11728 1726882223.71454: done sending task result for task 12673a56-9f93-5c28-a762-000000000b16 11728 1726882223.71458: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11728 1726882223.71566: no more pending results, returning what we have 11728 1726882223.71569: results queue empty 11728 1726882223.71570: checking for any_errors_fatal 11728 1726882223.71574: done checking for any_errors_fatal 11728 1726882223.71575: checking for max_fail_percentage 11728 1726882223.71576: done checking for max_fail_percentage 11728 1726882223.71577: checking to see if all hosts have failed and the running result is not ok 11728 1726882223.71578: done checking to see if all hosts have failed 11728 1726882223.71578: getting the remaining hosts for this loop 11728 1726882223.71580: done getting the remaining hosts for this loop 11728 1726882223.71583: getting the next task for host managed_node3 11728 1726882223.71588: done getting next task for host managed_node3 11728 1726882223.71594: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 11728 1726882223.71600: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882223.71610: getting variables 11728 1726882223.71612: in VariableManager get_vars() 11728 1726882223.71639: Calling all_inventory to load vars for managed_node3 11728 1726882223.71641: Calling groups_inventory to load vars for managed_node3 11728 1726882223.71643: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882223.71650: Calling all_plugins_play to load vars for managed_node3 11728 1726882223.71652: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882223.71655: Calling groups_plugins_play to load vars for managed_node3 11728 1726882223.72441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882223.74095: done with get_vars() 11728 1726882223.74120: done getting variables 11728 1726882223.74172: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:30:23 -0400 (0:00:00.978) 0:00:48.594 ****** 11728 1726882223.74206: entering _queue_task() for managed_node3/debug 11728 1726882223.74666: worker is 1 (out of 1 available) 11728 1726882223.74677: exiting _queue_task() for managed_node3/debug 11728 1726882223.74688: done queuing things up, now waiting for results queue to drain 11728 1726882223.74689: waiting for pending results... 
11728 1726882223.74928: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 11728 1726882223.75069: in run() - task 12673a56-9f93-5c28-a762-000000000a2f 11728 1726882223.75099: variable 'ansible_search_path' from source: unknown 11728 1726882223.75109: variable 'ansible_search_path' from source: unknown 11728 1726882223.75201: calling self._execute() 11728 1726882223.75267: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882223.75279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882223.75291: variable 'omit' from source: magic vars 11728 1726882223.75700: variable 'ansible_distribution_major_version' from source: facts 11728 1726882223.75719: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882223.75731: variable 'omit' from source: magic vars 11728 1726882223.75811: variable 'omit' from source: magic vars 11728 1726882223.75903: variable 'network_provider' from source: set_fact 11728 1726882223.75918: variable 'omit' from source: magic vars 11728 1726882223.75950: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882223.75980: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882223.76002: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882223.76018: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882223.76030: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882223.76052: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882223.76056: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882223.76059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882223.76133: Set connection var ansible_connection to ssh 11728 1726882223.76141: Set connection var ansible_shell_executable to /bin/sh 11728 1726882223.76146: Set connection var ansible_timeout to 10 11728 1726882223.76149: Set connection var ansible_shell_type to sh 11728 1726882223.76156: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882223.76161: Set connection var ansible_pipelining to False 11728 1726882223.76180: variable 'ansible_shell_executable' from source: unknown 11728 1726882223.76184: variable 'ansible_connection' from source: unknown 11728 1726882223.76187: variable 'ansible_module_compression' from source: unknown 11728 1726882223.76189: variable 'ansible_shell_type' from source: unknown 11728 1726882223.76191: variable 'ansible_shell_executable' from source: unknown 11728 1726882223.76198: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882223.76201: variable 'ansible_pipelining' from source: unknown 11728 1726882223.76204: variable 'ansible_timeout' from source: unknown 11728 1726882223.76207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882223.76307: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 11728 1726882223.76319: variable 'omit' from source: magic vars 11728 1726882223.76322: starting attempt loop 11728 1726882223.76324: running the handler 11728 1726882223.76361: handler run complete 11728 1726882223.76372: attempt loop complete, returning result 11728 1726882223.76375: _execute() done 11728 1726882223.76378: dumping result to json 11728 1726882223.76380: done dumping result, returning 11728 1726882223.76388: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-5c28-a762-000000000a2f] 11728 1726882223.76398: sending task result for task 12673a56-9f93-5c28-a762-000000000a2f 11728 1726882223.76475: done sending task result for task 12673a56-9f93-5c28-a762-000000000a2f 11728 1726882223.76478: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 11728 1726882223.76547: no more pending results, returning what we have 11728 1726882223.76551: results queue empty 11728 1726882223.76552: checking for any_errors_fatal 11728 1726882223.76563: done checking for any_errors_fatal 11728 1726882223.76563: checking for max_fail_percentage 11728 1726882223.76565: done checking for max_fail_percentage 11728 1726882223.76566: checking to see if all hosts have failed and the running result is not ok 11728 1726882223.76567: done checking to see if all hosts have failed 11728 1726882223.76567: getting the remaining hosts for this loop 11728 1726882223.76569: done getting the remaining hosts for this loop 11728 1726882223.76572: getting the next task for host managed_node3 11728 1726882223.76579: done getting next task for host managed_node3 11728 1726882223.76583: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11728 1726882223.76590: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882223.76603: getting variables 11728 1726882223.76605: in VariableManager get_vars() 11728 1726882223.76643: Calling all_inventory to load vars for managed_node3 11728 1726882223.76646: Calling groups_inventory to load vars for managed_node3 11728 1726882223.76648: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882223.76656: Calling all_plugins_play to load vars for managed_node3 11728 1726882223.76659: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882223.76661: Calling groups_plugins_play to load vars for managed_node3 11728 1726882223.77451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882223.78347: done with get_vars() 11728 1726882223.78366: done getting variables 11728 1726882223.78411: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:30:23 -0400 (0:00:00.042) 0:00:48.636 ****** 11728 1726882223.78448: entering _queue_task() for managed_node3/fail 11728 1726882223.78740: worker is 1 (out of 1 available) 11728 1726882223.78752: exiting _queue_task() for managed_node3/fail 11728 1726882223.78765: done queuing things up, now waiting for results queue to drain 11728 1726882223.78766: waiting for pending results... 
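
The "Print network provider" step whose result appears above is a plain debug task: the log shows network_provider being read from an earlier set_fact and the rendered message "Using network provider: nm". A hedged sketch of such a task (only the rendered output is in the log; the msg template below is an assumption):

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"
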
11728 1726882223.79214: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11728 1726882223.79305: in run() - task 12673a56-9f93-5c28-a762-000000000a30 11728 1726882223.79327: variable 'ansible_search_path' from source: unknown 11728 1726882223.79335: variable 'ansible_search_path' from source: unknown 11728 1726882223.79382: calling self._execute() 11728 1726882223.79567: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882223.79572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882223.79575: variable 'omit' from source: magic vars 11728 1726882223.79938: variable 'ansible_distribution_major_version' from source: facts 11728 1726882223.79948: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882223.80037: variable 'network_state' from source: role '' defaults 11728 1726882223.80044: Evaluated conditional (network_state != {}): False 11728 1726882223.80047: when evaluation is False, skipping this task 11728 1726882223.80050: _execute() done 11728 1726882223.80053: dumping result to json 11728 1726882223.80055: done dumping result, returning 11728 1726882223.80063: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-5c28-a762-000000000a30] 11728 1726882223.80068: sending task result for task 12673a56-9f93-5c28-a762-000000000a30 11728 1726882223.80167: done sending task result for task 12673a56-9f93-5c28-a762-000000000a30 11728 1726882223.80170: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11728 1726882223.80224: no more pending results, returning what we have 11728 1726882223.80229: results queue empty 11728 1726882223.80230: checking for any_errors_fatal 11728 1726882223.80236: done checking for any_errors_fatal 11728 1726882223.80237: checking for max_fail_percentage 11728 1726882223.80239: done checking for max_fail_percentage 11728 1726882223.80240: checking to see if all hosts have failed and the running result is not ok 11728 1726882223.80240: done checking to see if all hosts have failed 11728 1726882223.80241: getting the remaining hosts for this loop 11728 1726882223.80243: done getting the remaining hosts for this loop 11728 1726882223.80247: getting the next task for host managed_node3 11728 1726882223.80254: done getting next task for host managed_node3 11728 1726882223.80258: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11728 1726882223.80263: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882223.80290: getting variables 11728 1726882223.80292: in VariableManager get_vars() 11728 1726882223.80336: Calling all_inventory to load vars for managed_node3 11728 1726882223.80339: Calling groups_inventory to load vars for managed_node3 11728 1726882223.80341: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882223.80349: Calling all_plugins_play to load vars for managed_node3 11728 1726882223.80351: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882223.80354: Calling groups_plugins_play to load vars for managed_node3 11728 1726882223.81276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882223.82206: done with get_vars() 11728 1726882223.82221: done getting variables 11728 1726882223.82280: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:30:23 -0400 (0:00:00.038) 0:00:48.675 ****** 11728 1726882223.82311: entering _queue_task() for managed_node3/fail 11728 1726882223.82622: worker is 1 (out of 1 available) 11728 1726882223.82634: exiting _queue_task() for managed_node3/fail 11728 1726882223.82645: done queuing things up, now waiting for results queue to drain 11728 1726882223.82646: waiting for pending results... 
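
The skip recorded just above follows the guard pattern used by these abort tasks: a fail action whose when clause includes network_state != {}, which evaluated to False here because network_state still holds the role default of {}. A rough sketch of the shape of such a task; only the network_state != {} condition is confirmed by this log, the message wording and the initscripts clause are assumptions:

    - name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
      ansible.builtin.fail:
        msg: Applying network_state is not supported with the initscripts provider  # assumed wording
      when:
        - network_state != {}
        - network_provider == "initscripts"  # assumed companion condition
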
11728 1726882223.82959: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11728 1726882223.83063: in run() - task 12673a56-9f93-5c28-a762-000000000a31 11728 1726882223.83201: variable 'ansible_search_path' from source: unknown 11728 1726882223.83204: variable 'ansible_search_path' from source: unknown 11728 1726882223.83207: calling self._execute() 11728 1726882223.83221: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882223.83233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882223.83246: variable 'omit' from source: magic vars 11728 1726882223.83952: variable 'ansible_distribution_major_version' from source: facts 11728 1726882223.83968: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882223.84451: variable 'network_state' from source: role '' defaults 11728 1726882223.84471: Evaluated conditional (network_state != {}): False 11728 1726882223.84492: when evaluation is False, skipping this task 11728 1726882223.84504: _execute() done 11728 1726882223.84511: dumping result to json 11728 1726882223.84597: done dumping result, returning 11728 1726882223.84601: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-5c28-a762-000000000a31] 11728 1726882223.84604: sending task result for task 12673a56-9f93-5c28-a762-000000000a31 11728 1726882223.84792: done sending task result for task 12673a56-9f93-5c28-a762-000000000a31 11728 1726882223.84798: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11728 1726882223.84855: no more pending results, returning what we have 11728 1726882223.84860: results queue empty 11728 1726882223.84861: checking for any_errors_fatal 11728 1726882223.84869: done checking for any_errors_fatal 11728 1726882223.84870: checking for max_fail_percentage 11728 1726882223.84872: done checking for max_fail_percentage 11728 1726882223.84873: checking to see if all hosts have failed and the running result is not ok 11728 1726882223.84874: done checking to see if all hosts have failed 11728 1726882223.84874: getting the remaining hosts for this loop 11728 1726882223.84877: done getting the remaining hosts for this loop 11728 1726882223.84880: getting the next task for host managed_node3 11728 1726882223.84888: done getting next task for host managed_node3 11728 1726882223.84895: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11728 1726882223.84902: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882223.84927: getting variables 11728 1726882223.84929: in VariableManager get_vars() 11728 1726882223.84973: Calling all_inventory to load vars for managed_node3 11728 1726882223.84976: Calling groups_inventory to load vars for managed_node3 11728 1726882223.84978: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882223.85234: Calling all_plugins_play to load vars for managed_node3 11728 1726882223.85238: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882223.85242: Calling groups_plugins_play to load vars for managed_node3 11728 1726882223.86728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882223.90335: done with get_vars() 11728 1726882223.90365: done getting variables 11728 1726882223.90430: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:30:23 -0400 (0:00:00.081) 0:00:48.757 ****** 11728 1726882223.90468: entering _queue_task() for managed_node3/fail 11728 1726882223.90800: worker is 1 (out of 1 available) 11728 1726882223.90810: exiting _queue_task() for managed_node3/fail 11728 1726882223.90822: done queuing things up, now waiting for results queue to drain 11728 1726882223.90823: waiting for pending results... 
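
The "system version below 8" abort above short-circuits the same way: the shared network_state != {} condition is False on this host, so the version comparison never comes into play. A sketch under the assumption that the task pairs the confirmed network_state check with a major-version comparison, as its name suggests:

    - name: Abort applying the network state configuration if the system version of the managed host is below 8
      ansible.builtin.fail:
        msg: Applying network_state requires a managed host running version 8 or later  # assumed wording
      when:
        - network_state != {}
        - ansible_distribution_major_version | int < 8  # assumed, inferred from the task name
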
11728 1726882223.91116: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11728 1726882223.91277: in run() - task 12673a56-9f93-5c28-a762-000000000a32 11728 1726882223.91302: variable 'ansible_search_path' from source: unknown 11728 1726882223.91314: variable 'ansible_search_path' from source: unknown 11728 1726882223.91353: calling self._execute() 11728 1726882223.91452: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882223.91465: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882223.91530: variable 'omit' from source: magic vars 11728 1726882223.91881: variable 'ansible_distribution_major_version' from source: facts 11728 1726882223.91902: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882223.92088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882223.94712: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882223.94778: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882223.95001: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882223.95005: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882223.95008: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882223.95011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882223.95014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882223.95042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882223.95085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882223.95111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882223.95213: variable 'ansible_distribution_major_version' from source: facts 11728 1726882223.95240: Evaluated conditional (ansible_distribution_major_version | int > 9): True 11728 1726882223.95347: variable 'ansible_distribution' from source: facts 11728 1726882223.95355: variable '__network_rh_distros' from source: role '' defaults 11728 1726882223.95367: Evaluated conditional (ansible_distribution in __network_rh_distros): True 11728 1726882223.95614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882223.95671: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882223.95707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882223.95754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882223.95779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882223.95837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882223.95866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882223.95905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882223.95949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882223.95969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882223.96024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882223.96053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882223.96103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882223.96135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882223.96155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882223.96538: variable 'network_connections' from source: task vars 11728 1726882223.96541: variable 'controller_profile' from source: play vars 11728 1726882223.96578: variable 'controller_profile' from source: play vars 11728 1726882223.96599: variable 'controller_device' from source: play vars 11728 1726882223.96667: variable 'controller_device' from source: play vars 11728 1726882223.96683: variable 'dhcp_interface1' from 
source: play vars 11728 1726882223.96755: variable 'dhcp_interface1' from source: play vars 11728 1726882223.96770: variable 'port1_profile' from source: play vars 11728 1726882223.96865: variable 'port1_profile' from source: play vars 11728 1726882223.96869: variable 'dhcp_interface1' from source: play vars 11728 1726882223.96919: variable 'dhcp_interface1' from source: play vars 11728 1726882223.96931: variable 'controller_profile' from source: play vars 11728 1726882223.97002: variable 'controller_profile' from source: play vars 11728 1726882223.97082: variable 'port2_profile' from source: play vars 11728 1726882223.97085: variable 'port2_profile' from source: play vars 11728 1726882223.97087: variable 'dhcp_interface2' from source: play vars 11728 1726882223.97135: variable 'dhcp_interface2' from source: play vars 11728 1726882223.97156: variable 'controller_profile' from source: play vars 11728 1726882223.97225: variable 'controller_profile' from source: play vars 11728 1726882223.97237: variable 'network_state' from source: role '' defaults 11728 1726882223.97305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882223.97471: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882223.97520: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882223.97554: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882223.97589: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882223.97646: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882223.97700: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882223.97703: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882223.97735: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882223.97777: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 11728 1726882223.97785: when evaluation is False, skipping this task 11728 1726882223.97792: _execute() done 11728 1726882223.97843: dumping result to json 11728 1726882223.97846: done dumping result, returning 11728 1726882223.97849: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-5c28-a762-000000000a32] 11728 1726882223.97851: sending task result for task 12673a56-9f93-5c28-a762-000000000a32 skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", 
\"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 11728 1726882223.97997: no more pending results, returning what we have 11728 1726882223.98001: results queue empty 11728 1726882223.98002: checking for any_errors_fatal 11728 1726882223.98007: done checking for any_errors_fatal 11728 1726882223.98008: checking for max_fail_percentage 11728 1726882223.98010: done checking for max_fail_percentage 11728 1726882223.98011: checking to see if all hosts have failed and the running result is not ok 11728 1726882223.98012: done checking to see if all hosts have failed 11728 1726882223.98012: getting the remaining hosts for this loop 11728 1726882223.98014: done getting the remaining hosts for this loop 11728 1726882223.98017: getting the next task for host managed_node3 11728 1726882223.98026: done getting next task for host managed_node3 11728 1726882223.98030: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11728 1726882223.98035: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882223.98055: getting variables 11728 1726882223.98056: in VariableManager get_vars() 11728 1726882223.98103: Calling all_inventory to load vars for managed_node3 11728 1726882223.98106: Calling groups_inventory to load vars for managed_node3 11728 1726882223.98109: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882223.98120: Calling all_plugins_play to load vars for managed_node3 11728 1726882223.98123: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882223.98127: Calling groups_plugins_play to load vars for managed_node3 11728 1726882223.98912: done sending task result for task 12673a56-9f93-5c28-a762-000000000a32 11728 1726882223.98915: WORKER PROCESS EXITING 11728 1726882223.99983: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882224.01549: done with get_vars() 11728 1726882224.01579: done getting variables 11728 1726882224.01643: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:30:24 -0400 (0:00:00.112) 0:00:48.869 ****** 11728 1726882224.01679: entering _queue_task() for managed_node3/dnf 11728 1726882224.02132: worker is 1 (out of 1 available) 11728 1726882224.02144: exiting _queue_task() for managed_node3/dnf 11728 1726882224.02154: done queuing things up, now waiting for results queue to drain 11728 1726882224.02155: waiting for pending results... 
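
The false_condition quoted above is the full team-detection expression: it scans network_connections and network_state.interfaces for any entry whose type matches ^team$, and both lists came back empty here. Reassembled as a task-level when clause it reads roughly like this; the three conditions are the ones evaluated in this log, while the fail message is an assumption:

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Teaming configuration is not supported on EL10 or later  # assumed wording
      when:
        - ansible_distribution_major_version | int > 9
        - ansible_distribution in __network_rh_distros
        - >-
          network_connections | selectattr("type", "defined")
          | selectattr("type", "match", "^team$") | list | length > 0
          or network_state.get("interfaces", []) | selectattr("type", "defined")
          | selectattr("type", "match", "^team$") | list | length > 0
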
11728 1726882224.02369: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11728 1726882224.02546: in run() - task 12673a56-9f93-5c28-a762-000000000a33 11728 1726882224.02566: variable 'ansible_search_path' from source: unknown 11728 1726882224.02573: variable 'ansible_search_path' from source: unknown 11728 1726882224.02622: calling self._execute() 11728 1726882224.02720: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882224.02731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882224.02744: variable 'omit' from source: magic vars 11728 1726882224.03122: variable 'ansible_distribution_major_version' from source: facts 11728 1726882224.03142: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882224.03350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882224.06015: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882224.06099: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882224.06144: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882224.06273: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882224.06277: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882224.06354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882224.06418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882224.06452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882224.06512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882224.06535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882224.06658: variable 'ansible_distribution' from source: facts 11728 1726882224.06667: variable 'ansible_distribution_major_version' from source: facts 11728 1726882224.06704: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 11728 1726882224.06912: variable '__network_wireless_connections_defined' from source: role '' defaults 11728 1726882224.06968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882224.07039: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882224.07068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882224.07101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882224.07125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882224.07157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882224.07174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882224.07190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882224.07218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882224.07233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882224.07268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882224.07284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882224.07303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882224.07328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882224.07340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882224.07446: variable 'network_connections' from source: task vars 11728 1726882224.07455: variable 'controller_profile' from source: play vars 11728 1726882224.07505: variable 'controller_profile' from source: play vars 11728 1726882224.07513: variable 'controller_device' from source: play vars 11728 1726882224.07555: variable 'controller_device' from source: play vars 11728 1726882224.07563: variable 'dhcp_interface1' from 
source: play vars 11728 1726882224.07610: variable 'dhcp_interface1' from source: play vars 11728 1726882224.07618: variable 'port1_profile' from source: play vars 11728 1726882224.07662: variable 'port1_profile' from source: play vars 11728 1726882224.07666: variable 'dhcp_interface1' from source: play vars 11728 1726882224.07712: variable 'dhcp_interface1' from source: play vars 11728 1726882224.07718: variable 'controller_profile' from source: play vars 11728 1726882224.07759: variable 'controller_profile' from source: play vars 11728 1726882224.07766: variable 'port2_profile' from source: play vars 11728 1726882224.07812: variable 'port2_profile' from source: play vars 11728 1726882224.07818: variable 'dhcp_interface2' from source: play vars 11728 1726882224.07859: variable 'dhcp_interface2' from source: play vars 11728 1726882224.07870: variable 'controller_profile' from source: play vars 11728 1726882224.07913: variable 'controller_profile' from source: play vars 11728 1726882224.07959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882224.08077: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882224.08105: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882224.08132: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882224.08152: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882224.08191: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882224.08212: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882224.08233: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882224.08250: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882224.08299: variable '__network_team_connections_defined' from source: role '' defaults 11728 1726882224.08449: variable 'network_connections' from source: task vars 11728 1726882224.08452: variable 'controller_profile' from source: play vars 11728 1726882224.08492: variable 'controller_profile' from source: play vars 11728 1726882224.08499: variable 'controller_device' from source: play vars 11728 1726882224.08540: variable 'controller_device' from source: play vars 11728 1726882224.08548: variable 'dhcp_interface1' from source: play vars 11728 1726882224.08591: variable 'dhcp_interface1' from source: play vars 11728 1726882224.08599: variable 'port1_profile' from source: play vars 11728 1726882224.08640: variable 'port1_profile' from source: play vars 11728 1726882224.08644: variable 'dhcp_interface1' from source: play vars 11728 1726882224.08688: variable 'dhcp_interface1' from source: play vars 11728 1726882224.08697: variable 'controller_profile' from source: play vars 11728 1726882224.08755: variable 'controller_profile' from source: play vars 
11728 1726882224.08758: variable 'port2_profile' from source: play vars 11728 1726882224.08809: variable 'port2_profile' from source: play vars 11728 1726882224.08812: variable 'dhcp_interface2' from source: play vars 11728 1726882224.08998: variable 'dhcp_interface2' from source: play vars 11728 1726882224.09001: variable 'controller_profile' from source: play vars 11728 1726882224.09004: variable 'controller_profile' from source: play vars 11728 1726882224.09006: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11728 1726882224.09008: when evaluation is False, skipping this task 11728 1726882224.09010: _execute() done 11728 1726882224.09012: dumping result to json 11728 1726882224.09014: done dumping result, returning 11728 1726882224.09016: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-5c28-a762-000000000a33] 11728 1726882224.09018: sending task result for task 12673a56-9f93-5c28-a762-000000000a33 11728 1726882224.09106: done sending task result for task 12673a56-9f93-5c28-a762-000000000a33 11728 1726882224.09109: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11728 1726882224.09340: no more pending results, returning what we have 11728 1726882224.09343: results queue empty 11728 1726882224.09344: checking for any_errors_fatal 11728 1726882224.09350: done checking for any_errors_fatal 11728 1726882224.09351: checking for max_fail_percentage 11728 1726882224.09353: done checking for max_fail_percentage 11728 1726882224.09353: checking to see if all hosts have failed and the running result is not ok 11728 1726882224.09354: done checking to see if all hosts have failed 11728 1726882224.09355: getting the remaining hosts for this loop 11728 1726882224.09356: done getting the remaining hosts for this loop 11728 1726882224.09359: getting the next task for host managed_node3 11728 1726882224.09365: done getting next task for host managed_node3 11728 1726882224.09369: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11728 1726882224.09373: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882224.09390: getting variables 11728 1726882224.09391: in VariableManager get_vars() 11728 1726882224.09433: Calling all_inventory to load vars for managed_node3 11728 1726882224.09435: Calling groups_inventory to load vars for managed_node3 11728 1726882224.09437: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882224.09445: Calling all_plugins_play to load vars for managed_node3 11728 1726882224.09448: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882224.09450: Calling groups_plugins_play to load vars for managed_node3 11728 1726882224.10465: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882224.11342: done with get_vars() 11728 1726882224.11358: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11728 1726882224.11414: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:30:24 -0400 (0:00:00.097) 0:00:48.966 ****** 11728 1726882224.11438: entering _queue_task() for managed_node3/yum 11728 1726882224.11660: worker is 1 (out of 1 available) 11728 1726882224.11675: exiting _queue_task() for managed_node3/yum 11728 1726882224.11688: done queuing things up, now waiting for results queue to drain 11728 1726882224.11690: waiting for pending results... 
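What the skip above amounts to: the role evaluates the task's when clause, __network_wireless_connections_defined or __network_team_connections_defined, against the fully resolved network_connections (hence the long run of play-vars lookups for controller_profile, port1_profile, dhcp_interface1 and friends), finds it False because the connection list contains neither wireless nor team profiles, and returns a skip result without ever invoking the module. A minimal illustrative sketch of a task gated this way, reconstructed from the conditional printed in the log rather than taken from the role's source (the module arguments are placeholders):

  - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
    ansible.builtin.dnf:
      name: "{{ network_packages }}"   # placeholder; the real task's arguments are not visible in this log
      state: latest
    check_mode: true
    when: __network_wireless_connections_defined or __network_team_connections_defined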
11728 1726882224.11874: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11728 1726882224.12002: in run() - task 12673a56-9f93-5c28-a762-000000000a34 11728 1726882224.12006: variable 'ansible_search_path' from source: unknown 11728 1726882224.12009: variable 'ansible_search_path' from source: unknown 11728 1726882224.12042: calling self._execute() 11728 1726882224.12153: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882224.12182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882224.12199: variable 'omit' from source: magic vars 11728 1726882224.12572: variable 'ansible_distribution_major_version' from source: facts 11728 1726882224.12670: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882224.12782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882224.14729: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882224.14773: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882224.14803: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882224.14829: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882224.14848: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882224.14911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882224.14955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882224.15011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882224.15016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882224.15041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882224.15300: variable 'ansible_distribution_major_version' from source: facts 11728 1726882224.15304: Evaluated conditional (ansible_distribution_major_version | int < 8): False 11728 1726882224.15307: when evaluation is False, skipping this task 11728 1726882224.15310: _execute() done 11728 1726882224.15312: dumping result to json 11728 1726882224.15314: done dumping result, returning 11728 1726882224.15317: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-5c28-a762-000000000a34] 11728 
1726882224.15319: sending task result for task 12673a56-9f93-5c28-a762-000000000a34 11728 1726882224.15449: done sending task result for task 12673a56-9f93-5c28-a762-000000000a34 11728 1726882224.15452: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 11728 1726882224.15749: no more pending results, returning what we have 11728 1726882224.15753: results queue empty 11728 1726882224.15754: checking for any_errors_fatal 11728 1726882224.15760: done checking for any_errors_fatal 11728 1726882224.15760: checking for max_fail_percentage 11728 1726882224.15762: done checking for max_fail_percentage 11728 1726882224.15763: checking to see if all hosts have failed and the running result is not ok 11728 1726882224.15764: done checking to see if all hosts have failed 11728 1726882224.15765: getting the remaining hosts for this loop 11728 1726882224.15767: done getting the remaining hosts for this loop 11728 1726882224.15770: getting the next task for host managed_node3 11728 1726882224.15778: done getting next task for host managed_node3 11728 1726882224.15783: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11728 1726882224.15788: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882224.15818: getting variables 11728 1726882224.15820: in VariableManager get_vars() 11728 1726882224.15863: Calling all_inventory to load vars for managed_node3 11728 1726882224.15866: Calling groups_inventory to load vars for managed_node3 11728 1726882224.15869: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882224.15880: Calling all_plugins_play to load vars for managed_node3 11728 1726882224.15882: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882224.15885: Calling groups_plugins_play to load vars for managed_node3 11728 1726882224.17577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882224.18530: done with get_vars() 11728 1726882224.18548: done getting variables 11728 1726882224.18603: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:30:24 -0400 (0:00:00.071) 0:00:49.038 ****** 11728 1726882224.18638: entering _queue_task() for managed_node3/fail 11728 1726882224.18905: worker is 1 (out of 1 available) 11728 1726882224.18921: exiting _queue_task() for managed_node3/fail 11728 1726882224.18934: done queuing things up, now waiting for results queue to drain 11728 1726882224.18935: waiting for pending results... 
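The YUM variant of the same check, whose execution is traced above, is additionally gated on the distribution version: ansible-core redirects ansible.builtin.yum to ansible.builtin.dnf (as logged), and the yum path only runs when ansible_distribution_major_version | int < 8. On this host the major version is not below 8, so that test is False and the task is skipped before the wireless/team conditional implied by its name is ever consulted. A hedged sketch of such a version-gated task, reconstructed from the conditionals shown in the log rather than from the role's actual YAML (arguments are placeholders):

  - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
    ansible.builtin.yum:                 # redirected to ansible.builtin.dnf by ansible-core, as shown above
      name: "{{ network_packages }}"     # placeholder; real arguments are not visible in this log
      state: latest
    check_mode: true
    when:
      - ansible_distribution_major_version | int < 8
      - __network_wireless_connections_defined or __network_team_connections_defined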
11728 1726882224.19412: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11728 1726882224.19418: in run() - task 12673a56-9f93-5c28-a762-000000000a35 11728 1726882224.19421: variable 'ansible_search_path' from source: unknown 11728 1726882224.19424: variable 'ansible_search_path' from source: unknown 11728 1726882224.19427: calling self._execute() 11728 1726882224.19781: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882224.19785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882224.19788: variable 'omit' from source: magic vars 11728 1726882224.20167: variable 'ansible_distribution_major_version' from source: facts 11728 1726882224.20218: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882224.20372: variable '__network_wireless_connections_defined' from source: role '' defaults 11728 1726882224.20720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882224.22281: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882224.22359: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882224.22397: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882224.22441: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882224.22467: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882224.22554: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882224.22807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882224.22811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882224.22849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882224.22872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882224.22928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882224.22957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882224.22996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882224.23065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882224.23074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882224.23183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882224.23187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882224.23189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882224.23225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882224.23244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882224.23436: variable 'network_connections' from source: task vars 11728 1726882224.23453: variable 'controller_profile' from source: play vars 11728 1726882224.23536: variable 'controller_profile' from source: play vars 11728 1726882224.23552: variable 'controller_device' from source: play vars 11728 1726882224.23625: variable 'controller_device' from source: play vars 11728 1726882224.23641: variable 'dhcp_interface1' from source: play vars 11728 1726882224.23709: variable 'dhcp_interface1' from source: play vars 11728 1726882224.23726: variable 'port1_profile' from source: play vars 11728 1726882224.23791: variable 'port1_profile' from source: play vars 11728 1726882224.23835: variable 'dhcp_interface1' from source: play vars 11728 1726882224.23877: variable 'dhcp_interface1' from source: play vars 11728 1726882224.23888: variable 'controller_profile' from source: play vars 11728 1726882224.23951: variable 'controller_profile' from source: play vars 11728 1726882224.23963: variable 'port2_profile' from source: play vars 11728 1726882224.24053: variable 'port2_profile' from source: play vars 11728 1726882224.24056: variable 'dhcp_interface2' from source: play vars 11728 1726882224.24096: variable 'dhcp_interface2' from source: play vars 11728 1726882224.24110: variable 'controller_profile' from source: play vars 11728 1726882224.24169: variable 'controller_profile' from source: play vars 11728 1726882224.24272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882224.24412: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882224.24457: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882224.24498: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882224.24533: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882224.24601: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882224.24605: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882224.24635: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882224.24672: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882224.24800: variable '__network_team_connections_defined' from source: role '' defaults 11728 1726882224.24986: variable 'network_connections' from source: task vars 11728 1726882224.25005: variable 'controller_profile' from source: play vars 11728 1726882224.25075: variable 'controller_profile' from source: play vars 11728 1726882224.25087: variable 'controller_device' from source: play vars 11728 1726882224.25157: variable 'controller_device' from source: play vars 11728 1726882224.25171: variable 'dhcp_interface1' from source: play vars 11728 1726882224.25236: variable 'dhcp_interface1' from source: play vars 11728 1726882224.25255: variable 'port1_profile' from source: play vars 11728 1726882224.25323: variable 'port1_profile' from source: play vars 11728 1726882224.25335: variable 'dhcp_interface1' from source: play vars 11728 1726882224.25470: variable 'dhcp_interface1' from source: play vars 11728 1726882224.25473: variable 'controller_profile' from source: play vars 11728 1726882224.25475: variable 'controller_profile' from source: play vars 11728 1726882224.25486: variable 'port2_profile' from source: play vars 11728 1726882224.25548: variable 'port2_profile' from source: play vars 11728 1726882224.25560: variable 'dhcp_interface2' from source: play vars 11728 1726882224.25629: variable 'dhcp_interface2' from source: play vars 11728 1726882224.25640: variable 'controller_profile' from source: play vars 11728 1726882224.25708: variable 'controller_profile' from source: play vars 11728 1726882224.25745: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11728 1726882224.25754: when evaluation is False, skipping this task 11728 1726882224.25762: _execute() done 11728 1726882224.25768: dumping result to json 11728 1726882224.25775: done dumping result, returning 11728 1726882224.25789: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-5c28-a762-000000000a35] 11728 1726882224.26002: sending task result for task 12673a56-9f93-5c28-a762-000000000a35 11728 1726882224.26077: done sending task result for task 12673a56-9f93-5c28-a762-000000000a35 11728 1726882224.26080: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": 
"Conditional result was False" } 11728 1726882224.26136: no more pending results, returning what we have 11728 1726882224.26140: results queue empty 11728 1726882224.26141: checking for any_errors_fatal 11728 1726882224.26149: done checking for any_errors_fatal 11728 1726882224.26150: checking for max_fail_percentage 11728 1726882224.26151: done checking for max_fail_percentage 11728 1726882224.26152: checking to see if all hosts have failed and the running result is not ok 11728 1726882224.26153: done checking to see if all hosts have failed 11728 1726882224.26154: getting the remaining hosts for this loop 11728 1726882224.26155: done getting the remaining hosts for this loop 11728 1726882224.26159: getting the next task for host managed_node3 11728 1726882224.26166: done getting next task for host managed_node3 11728 1726882224.26170: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 11728 1726882224.26176: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882224.26201: getting variables 11728 1726882224.26202: in VariableManager get_vars() 11728 1726882224.26245: Calling all_inventory to load vars for managed_node3 11728 1726882224.26248: Calling groups_inventory to load vars for managed_node3 11728 1726882224.26250: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882224.26260: Calling all_plugins_play to load vars for managed_node3 11728 1726882224.26263: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882224.26265: Calling groups_plugins_play to load vars for managed_node3 11728 1726882224.27803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882224.29505: done with get_vars() 11728 1726882224.29529: done getting variables 11728 1726882224.29590: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:30:24 -0400 (0:00:00.109) 0:00:49.148 ****** 11728 1726882224.29633: entering _queue_task() for managed_node3/package 11728 1726882224.29983: worker is 1 (out of 1 available) 11728 1726882224.29999: exiting _queue_task() for managed_node3/package 11728 1726882224.30012: done queuing things up, now waiting for results queue to drain 11728 1726882224.30013: waiting for pending results... 11728 1726882224.30317: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 11728 1726882224.30478: in run() - task 12673a56-9f93-5c28-a762-000000000a36 11728 1726882224.30500: variable 'ansible_search_path' from source: unknown 11728 1726882224.30507: variable 'ansible_search_path' from source: unknown 11728 1726882224.30549: calling self._execute() 11728 1726882224.30646: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882224.30660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882224.30674: variable 'omit' from source: magic vars 11728 1726882224.31059: variable 'ansible_distribution_major_version' from source: facts 11728 1726882224.31084: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882224.31276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882224.31558: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882224.31801: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882224.31804: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882224.31807: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882224.31848: variable 'network_packages' from source: role '' defaults 11728 1726882224.31957: variable '__network_provider_setup' from source: role '' defaults 11728 1726882224.31976: variable '__network_service_name_default_nm' from source: role '' defaults 11728 1726882224.32044: variable 
'__network_service_name_default_nm' from source: role '' defaults 11728 1726882224.32059: variable '__network_packages_default_nm' from source: role '' defaults 11728 1726882224.32128: variable '__network_packages_default_nm' from source: role '' defaults 11728 1726882224.32314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882224.33662: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882224.33706: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882224.33735: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882224.33760: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882224.33788: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882224.33858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882224.33877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882224.33899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882224.33924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882224.33940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882224.33968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882224.33988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882224.34198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882224.34202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882224.34205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882224.34262: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11728 1726882224.34348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882224.34365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882224.34382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882224.34410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882224.34424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882224.34482: variable 'ansible_python' from source: facts 11728 1726882224.34499: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11728 1726882224.34553: variable '__network_wpa_supplicant_required' from source: role '' defaults 11728 1726882224.34608: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11728 1726882224.34688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882224.34708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882224.34726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882224.34754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882224.34764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882224.34800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882224.34818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882224.34835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882224.34863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882224.34873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882224.34967: variable 'network_connections' from source: task vars 11728 1726882224.34977: variable 'controller_profile' from source: play vars 11728 1726882224.35203: variable 'controller_profile' from source: play vars 11728 1726882224.35206: variable 'controller_device' from source: play vars 11728 1726882224.35208: variable 'controller_device' from source: play vars 11728 1726882224.35210: variable 'dhcp_interface1' from source: play vars 11728 1726882224.35273: variable 'dhcp_interface1' from source: play vars 11728 1726882224.35286: variable 'port1_profile' from source: play vars 11728 1726882224.35392: variable 'port1_profile' from source: play vars 11728 1726882224.35414: variable 'dhcp_interface1' from source: play vars 11728 1726882224.35523: variable 'dhcp_interface1' from source: play vars 11728 1726882224.35545: variable 'controller_profile' from source: play vars 11728 1726882224.35657: variable 'controller_profile' from source: play vars 11728 1726882224.35665: variable 'port2_profile' from source: play vars 11728 1726882224.35738: variable 'port2_profile' from source: play vars 11728 1726882224.35751: variable 'dhcp_interface2' from source: play vars 11728 1726882224.35819: variable 'dhcp_interface2' from source: play vars 11728 1726882224.35827: variable 'controller_profile' from source: play vars 11728 1726882224.35898: variable 'controller_profile' from source: play vars 11728 1726882224.35951: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882224.35973: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882224.36000: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882224.36022: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882224.36062: variable '__network_wireless_connections_defined' from source: role '' defaults 11728 1726882224.36246: variable 'network_connections' from source: task vars 11728 1726882224.36249: variable 'controller_profile' from source: play vars 11728 1726882224.36323: variable 'controller_profile' from source: play vars 11728 1726882224.36331: variable 'controller_device' from source: play vars 11728 1726882224.36400: variable 'controller_device' from source: play vars 11728 1726882224.36412: variable 'dhcp_interface1' from source: play vars 11728 1726882224.36458: variable 'dhcp_interface1' from source: play vars 11728 1726882224.36467: variable 'port1_profile' from source: play vars 11728 1726882224.36537: variable 'port1_profile' from source: play vars 11728 1726882224.36545: variable 'dhcp_interface1' from source: play vars 11728 1726882224.36612: variable 'dhcp_interface1' from source: play vars 11728 1726882224.36621: variable 'controller_profile' from source: play vars 11728 1726882224.36687: variable 'controller_profile' from source: play vars 11728 1726882224.36698: variable 'port2_profile' from source: play vars 
11728 1726882224.36769: variable 'port2_profile' from source: play vars 11728 1726882224.37001: variable 'dhcp_interface2' from source: play vars 11728 1726882224.37005: variable 'dhcp_interface2' from source: play vars 11728 1726882224.37007: variable 'controller_profile' from source: play vars 11728 1726882224.37009: variable 'controller_profile' from source: play vars 11728 1726882224.37066: variable '__network_packages_default_wireless' from source: role '' defaults 11728 1726882224.37151: variable '__network_wireless_connections_defined' from source: role '' defaults 11728 1726882224.37445: variable 'network_connections' from source: task vars 11728 1726882224.37454: variable 'controller_profile' from source: play vars 11728 1726882224.37522: variable 'controller_profile' from source: play vars 11728 1726882224.37532: variable 'controller_device' from source: play vars 11728 1726882224.37623: variable 'controller_device' from source: play vars 11728 1726882224.37638: variable 'dhcp_interface1' from source: play vars 11728 1726882224.37743: variable 'dhcp_interface1' from source: play vars 11728 1726882224.37754: variable 'port1_profile' from source: play vars 11728 1726882224.37838: variable 'port1_profile' from source: play vars 11728 1726882224.37860: variable 'dhcp_interface1' from source: play vars 11728 1726882224.37935: variable 'dhcp_interface1' from source: play vars 11728 1726882224.37941: variable 'controller_profile' from source: play vars 11728 1726882224.37991: variable 'controller_profile' from source: play vars 11728 1726882224.38002: variable 'port2_profile' from source: play vars 11728 1726882224.38051: variable 'port2_profile' from source: play vars 11728 1726882224.38057: variable 'dhcp_interface2' from source: play vars 11728 1726882224.38108: variable 'dhcp_interface2' from source: play vars 11728 1726882224.38113: variable 'controller_profile' from source: play vars 11728 1726882224.38157: variable 'controller_profile' from source: play vars 11728 1726882224.38177: variable '__network_packages_default_team' from source: role '' defaults 11728 1726882224.38236: variable '__network_team_connections_defined' from source: role '' defaults 11728 1726882224.38432: variable 'network_connections' from source: task vars 11728 1726882224.38435: variable 'controller_profile' from source: play vars 11728 1726882224.38479: variable 'controller_profile' from source: play vars 11728 1726882224.38485: variable 'controller_device' from source: play vars 11728 1726882224.38535: variable 'controller_device' from source: play vars 11728 1726882224.38542: variable 'dhcp_interface1' from source: play vars 11728 1726882224.38588: variable 'dhcp_interface1' from source: play vars 11728 1726882224.38596: variable 'port1_profile' from source: play vars 11728 1726882224.38643: variable 'port1_profile' from source: play vars 11728 1726882224.38649: variable 'dhcp_interface1' from source: play vars 11728 1726882224.38695: variable 'dhcp_interface1' from source: play vars 11728 1726882224.38703: variable 'controller_profile' from source: play vars 11728 1726882224.38749: variable 'controller_profile' from source: play vars 11728 1726882224.38752: variable 'port2_profile' from source: play vars 11728 1726882224.38797: variable 'port2_profile' from source: play vars 11728 1726882224.38805: variable 'dhcp_interface2' from source: play vars 11728 1726882224.38851: variable 'dhcp_interface2' from source: play vars 11728 1726882224.38854: variable 'controller_profile' from source: play vars 11728 
1726882224.38903: variable 'controller_profile' from source: play vars 11728 1726882224.38949: variable '__network_service_name_default_initscripts' from source: role '' defaults 11728 1726882224.38992: variable '__network_service_name_default_initscripts' from source: role '' defaults 11728 1726882224.39002: variable '__network_packages_default_initscripts' from source: role '' defaults 11728 1726882224.39044: variable '__network_packages_default_initscripts' from source: role '' defaults 11728 1726882224.39179: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11728 1726882224.39503: variable 'network_connections' from source: task vars 11728 1726882224.39506: variable 'controller_profile' from source: play vars 11728 1726882224.39550: variable 'controller_profile' from source: play vars 11728 1726882224.39556: variable 'controller_device' from source: play vars 11728 1726882224.39598: variable 'controller_device' from source: play vars 11728 1726882224.39607: variable 'dhcp_interface1' from source: play vars 11728 1726882224.39651: variable 'dhcp_interface1' from source: play vars 11728 1726882224.39657: variable 'port1_profile' from source: play vars 11728 1726882224.39777: variable 'port1_profile' from source: play vars 11728 1726882224.39780: variable 'dhcp_interface1' from source: play vars 11728 1726882224.39782: variable 'dhcp_interface1' from source: play vars 11728 1726882224.39812: variable 'controller_profile' from source: play vars 11728 1726882224.39902: variable 'controller_profile' from source: play vars 11728 1726882224.39915: variable 'port2_profile' from source: play vars 11728 1726882224.39983: variable 'port2_profile' from source: play vars 11728 1726882224.40108: variable 'dhcp_interface2' from source: play vars 11728 1726882224.40153: variable 'dhcp_interface2' from source: play vars 11728 1726882224.40164: variable 'controller_profile' from source: play vars 11728 1726882224.40224: variable 'controller_profile' from source: play vars 11728 1726882224.40237: variable 'ansible_distribution' from source: facts 11728 1726882224.40245: variable '__network_rh_distros' from source: role '' defaults 11728 1726882224.40300: variable 'ansible_distribution_major_version' from source: facts 11728 1726882224.40303: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11728 1726882224.40512: variable 'ansible_distribution' from source: facts 11728 1726882224.40515: variable '__network_rh_distros' from source: role '' defaults 11728 1726882224.40517: variable 'ansible_distribution_major_version' from source: facts 11728 1726882224.40518: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11728 1726882224.40650: variable 'ansible_distribution' from source: facts 11728 1726882224.40658: variable '__network_rh_distros' from source: role '' defaults 11728 1726882224.40668: variable 'ansible_distribution_major_version' from source: facts 11728 1726882224.40711: variable 'network_provider' from source: set_fact 11728 1726882224.40730: variable 'ansible_facts' from source: unknown 11728 1726882224.41443: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 11728 1726882224.41452: when evaluation is False, skipping this task 11728 1726882224.41459: _execute() done 11728 1726882224.41465: dumping result to json 11728 1726882224.41472: done dumping result, returning 11728 1726882224.41483: done running TaskExecutor() for 
managed_node3/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-5c28-a762-000000000a36] 11728 1726882224.41492: sending task result for task 12673a56-9f93-5c28-a762-000000000a36 11728 1726882224.41968: done sending task result for task 12673a56-9f93-5c28-a762-000000000a36 11728 1726882224.41971: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 11728 1726882224.42043: no more pending results, returning what we have 11728 1726882224.42047: results queue empty 11728 1726882224.42048: checking for any_errors_fatal 11728 1726882224.42054: done checking for any_errors_fatal 11728 1726882224.42054: checking for max_fail_percentage 11728 1726882224.42056: done checking for max_fail_percentage 11728 1726882224.42057: checking to see if all hosts have failed and the running result is not ok 11728 1726882224.42058: done checking to see if all hosts have failed 11728 1726882224.42058: getting the remaining hosts for this loop 11728 1726882224.42060: done getting the remaining hosts for this loop 11728 1726882224.42063: getting the next task for host managed_node3 11728 1726882224.42070: done getting next task for host managed_node3 11728 1726882224.42073: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11728 1726882224.42078: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882224.42101: getting variables 11728 1726882224.42102: in VariableManager get_vars() 11728 1726882224.42144: Calling all_inventory to load vars for managed_node3 11728 1726882224.42147: Calling groups_inventory to load vars for managed_node3 11728 1726882224.42149: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882224.42158: Calling all_plugins_play to load vars for managed_node3 11728 1726882224.42160: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882224.42163: Calling groups_plugins_play to load vars for managed_node3 11728 1726882224.44407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882224.46019: done with get_vars() 11728 1726882224.46046: done getting variables 11728 1726882224.46114: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:30:24 -0400 (0:00:00.165) 0:00:49.314 ****** 11728 1726882224.46163: entering _queue_task() for managed_node3/package 11728 1726882224.46500: worker is 1 (out of 1 available) 11728 1726882224.46511: exiting _queue_task() for managed_node3/package 11728 1726882224.46524: done queuing things up, now waiting for results queue to drain 11728 1726882224.46526: waiting for pending results... 
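The Install packages task above illustrates a different gate: instead of a role flag, it uses the subset test against the package facts collected earlier in the play (presumably via package_facts). network_packages, assembled from the provider defaults resolved above such as __network_packages_default_nm, is compared against ansible_facts.packages.keys(); because every required package is already present on managed_node3, the negated subset test is False and the package module is never invoked. A minimal sketch of that pattern, using the conditional exactly as printed in the log (the module arguments are an assumption):

  - name: Install packages
    ansible.builtin.package:
      name: "{{ network_packages }}"   # assumption; the actual arguments are not shown in this log
      state: present
    when: not network_packages is subset(ansible_facts.packages.keys())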
11728 1726882224.46832: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11728 1726882224.47139: in run() - task 12673a56-9f93-5c28-a762-000000000a37 11728 1726882224.47165: variable 'ansible_search_path' from source: unknown 11728 1726882224.47317: variable 'ansible_search_path' from source: unknown 11728 1726882224.47368: calling self._execute() 11728 1726882224.47699: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882224.47703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882224.47706: variable 'omit' from source: magic vars 11728 1726882224.48509: variable 'ansible_distribution_major_version' from source: facts 11728 1726882224.48526: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882224.48679: variable 'network_state' from source: role '' defaults 11728 1726882224.48700: Evaluated conditional (network_state != {}): False 11728 1726882224.48817: when evaluation is False, skipping this task 11728 1726882224.48821: _execute() done 11728 1726882224.48823: dumping result to json 11728 1726882224.48826: done dumping result, returning 11728 1726882224.48828: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-5c28-a762-000000000a37] 11728 1726882224.48831: sending task result for task 12673a56-9f93-5c28-a762-000000000a37 11728 1726882224.48912: done sending task result for task 12673a56-9f93-5c28-a762-000000000a37 11728 1726882224.48915: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11728 1726882224.48968: no more pending results, returning what we have 11728 1726882224.48973: results queue empty 11728 1726882224.48974: checking for any_errors_fatal 11728 1726882224.48979: done checking for any_errors_fatal 11728 1726882224.48980: checking for max_fail_percentage 11728 1726882224.48982: done checking for max_fail_percentage 11728 1726882224.48983: checking to see if all hosts have failed and the running result is not ok 11728 1726882224.48983: done checking to see if all hosts have failed 11728 1726882224.48984: getting the remaining hosts for this loop 11728 1726882224.48986: done getting the remaining hosts for this loop 11728 1726882224.48989: getting the next task for host managed_node3 11728 1726882224.49001: done getting next task for host managed_node3 11728 1726882224.49005: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11728 1726882224.49011: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882224.49034: getting variables 11728 1726882224.49036: in VariableManager get_vars() 11728 1726882224.49080: Calling all_inventory to load vars for managed_node3 11728 1726882224.49083: Calling groups_inventory to load vars for managed_node3 11728 1726882224.49086: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882224.49202: Calling all_plugins_play to load vars for managed_node3 11728 1726882224.49206: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882224.49210: Calling groups_plugins_play to load vars for managed_node3 11728 1726882224.50856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882224.52428: done with get_vars() 11728 1726882224.52452: done getting variables 11728 1726882224.52512: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:30:24 -0400 (0:00:00.063) 0:00:49.377 ****** 11728 1726882224.52546: entering _queue_task() for managed_node3/package 11728 1726882224.52834: worker is 1 (out of 1 available) 11728 1726882224.52846: exiting _queue_task() for managed_node3/package 11728 1726882224.52858: done queuing things up, now waiting for results queue to drain 11728 1726882224.52859: waiting for pending results... 
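The nmstate-related install tasks here, the one just skipped and the python3-libnmstate task queued above, are both keyed on the network_state role variable: it defaults to an empty dict, so network_state != {} evaluates False and the nmstate code path is bypassed unless the caller supplies a declarative network_state. A hedged sketch of the first of these tasks (package names inferred from the task title, not from the role source):

  - name: Install NetworkManager and nmstate when using network_state variable
    ansible.builtin.package:
      name:
        - NetworkManager   # inferred from the task name; treat as an assumption
        - nmstate
      state: present
    when: network_state != {}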
11728 1726882224.53220: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11728 1726882224.53319: in run() - task 12673a56-9f93-5c28-a762-000000000a38 11728 1726882224.53324: variable 'ansible_search_path' from source: unknown 11728 1726882224.53331: variable 'ansible_search_path' from source: unknown 11728 1726882224.53367: calling self._execute() 11728 1726882224.53500: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882224.53505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882224.53508: variable 'omit' from source: magic vars 11728 1726882224.53858: variable 'ansible_distribution_major_version' from source: facts 11728 1726882224.54101: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882224.54104: variable 'network_state' from source: role '' defaults 11728 1726882224.54107: Evaluated conditional (network_state != {}): False 11728 1726882224.54109: when evaluation is False, skipping this task 11728 1726882224.54111: _execute() done 11728 1726882224.54114: dumping result to json 11728 1726882224.54116: done dumping result, returning 11728 1726882224.54118: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-5c28-a762-000000000a38] 11728 1726882224.54121: sending task result for task 12673a56-9f93-5c28-a762-000000000a38 11728 1726882224.54192: done sending task result for task 12673a56-9f93-5c28-a762-000000000a38 11728 1726882224.54200: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11728 1726882224.54252: no more pending results, returning what we have 11728 1726882224.54256: results queue empty 11728 1726882224.54257: checking for any_errors_fatal 11728 1726882224.54266: done checking for any_errors_fatal 11728 1726882224.54266: checking for max_fail_percentage 11728 1726882224.54268: done checking for max_fail_percentage 11728 1726882224.54270: checking to see if all hosts have failed and the running result is not ok 11728 1726882224.54270: done checking to see if all hosts have failed 11728 1726882224.54271: getting the remaining hosts for this loop 11728 1726882224.54273: done getting the remaining hosts for this loop 11728 1726882224.54277: getting the next task for host managed_node3 11728 1726882224.54288: done getting next task for host managed_node3 11728 1726882224.54292: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11728 1726882224.54302: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882224.54327: getting variables 11728 1726882224.54329: in VariableManager get_vars() 11728 1726882224.54375: Calling all_inventory to load vars for managed_node3 11728 1726882224.54378: Calling groups_inventory to load vars for managed_node3 11728 1726882224.54381: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882224.54397: Calling all_plugins_play to load vars for managed_node3 11728 1726882224.54401: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882224.54404: Calling groups_plugins_play to load vars for managed_node3 11728 1726882224.56008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882224.58557: done with get_vars() 11728 1726882224.58590: done getting variables 11728 1726882224.58859: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:30:24 -0400 (0:00:00.063) 0:00:49.441 ****** 11728 1726882224.58905: entering _queue_task() for managed_node3/service 11728 1726882224.59690: worker is 1 (out of 1 available) 11728 1726882224.59707: exiting _queue_task() for managed_node3/service 11728 1726882224.59720: done queuing things up, now waiting for results queue to drain 11728 1726882224.59721: waiting for pending results... 
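Next in the queue is the role's conditional NetworkManager restart (tasks/main.yml:109). It only fires when wireless or team connections are defined; the conditional evaluation further down shows neither is set in this play, so it is skipped as well. A hedged sketch of the shape of such a task, assuming the builtin service module (only the task name, the 'service' action plugin, and the when-condition appear in the log; the restart arguments are an assumption):

```yaml
# Sketch only: inferred from the task banner and the conditional
# (__network_wireless_connections_defined or __network_team_connections_defined)
# evaluated below; the role's actual task body may differ.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager   # assumed; the role may use a variable such as network_service_name
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
```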
11728 1726882224.60534: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11728 1726882224.60676: in run() - task 12673a56-9f93-5c28-a762-000000000a39 11728 1726882224.60701: variable 'ansible_search_path' from source: unknown 11728 1726882224.60710: variable 'ansible_search_path' from source: unknown 11728 1726882224.60750: calling self._execute() 11728 1726882224.60855: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882224.60866: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882224.60885: variable 'omit' from source: magic vars 11728 1726882224.61271: variable 'ansible_distribution_major_version' from source: facts 11728 1726882224.61291: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882224.61427: variable '__network_wireless_connections_defined' from source: role '' defaults 11728 1726882224.61629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882224.64127: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882224.64200: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882224.64240: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882224.64284: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882224.64321: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882224.64407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882224.64446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882224.64481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882224.64530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882224.64549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882224.64608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882224.64640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882224.64688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 11728 1726882224.64724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882224.64799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882224.64803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882224.64820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882224.64848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882224.64888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882224.64917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882224.65100: variable 'network_connections' from source: task vars 11728 1726882224.65125: variable 'controller_profile' from source: play vars 11728 1726882224.65201: variable 'controller_profile' from source: play vars 11728 1726882224.65235: variable 'controller_device' from source: play vars 11728 1726882224.65296: variable 'controller_device' from source: play vars 11728 1726882224.65344: variable 'dhcp_interface1' from source: play vars 11728 1726882224.65379: variable 'dhcp_interface1' from source: play vars 11728 1726882224.65392: variable 'port1_profile' from source: play vars 11728 1726882224.65472: variable 'port1_profile' from source: play vars 11728 1726882224.65486: variable 'dhcp_interface1' from source: play vars 11728 1726882224.65562: variable 'dhcp_interface1' from source: play vars 11728 1726882224.65669: variable 'controller_profile' from source: play vars 11728 1726882224.65673: variable 'controller_profile' from source: play vars 11728 1726882224.65675: variable 'port2_profile' from source: play vars 11728 1726882224.65713: variable 'port2_profile' from source: play vars 11728 1726882224.65725: variable 'dhcp_interface2' from source: play vars 11728 1726882224.65791: variable 'dhcp_interface2' from source: play vars 11728 1726882224.65808: variable 'controller_profile' from source: play vars 11728 1726882224.65883: variable 'controller_profile' from source: play vars 11728 1726882224.65965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882224.66155: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882224.66199: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882224.66244: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 
1726882224.66281: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882224.66342: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882224.66446: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882224.66449: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882224.66461: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882224.66523: variable '__network_team_connections_defined' from source: role '' defaults 11728 1726882224.66741: variable 'network_connections' from source: task vars 11728 1726882224.66753: variable 'controller_profile' from source: play vars 11728 1726882224.66825: variable 'controller_profile' from source: play vars 11728 1726882224.66837: variable 'controller_device' from source: play vars 11728 1726882224.66904: variable 'controller_device' from source: play vars 11728 1726882224.66916: variable 'dhcp_interface1' from source: play vars 11728 1726882224.67300: variable 'dhcp_interface1' from source: play vars 11728 1726882224.67303: variable 'port1_profile' from source: play vars 11728 1726882224.67306: variable 'port1_profile' from source: play vars 11728 1726882224.67307: variable 'dhcp_interface1' from source: play vars 11728 1726882224.67328: variable 'dhcp_interface1' from source: play vars 11728 1726882224.67339: variable 'controller_profile' from source: play vars 11728 1726882224.67403: variable 'controller_profile' from source: play vars 11728 1726882224.67639: variable 'port2_profile' from source: play vars 11728 1726882224.67642: variable 'port2_profile' from source: play vars 11728 1726882224.67644: variable 'dhcp_interface2' from source: play vars 11728 1726882224.67738: variable 'dhcp_interface2' from source: play vars 11728 1726882224.67809: variable 'controller_profile' from source: play vars 11728 1726882224.67874: variable 'controller_profile' from source: play vars 11728 1726882224.67954: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11728 1726882224.67998: when evaluation is False, skipping this task 11728 1726882224.68007: _execute() done 11728 1726882224.68079: dumping result to json 11728 1726882224.68086: done dumping result, returning 11728 1726882224.68102: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-5c28-a762-000000000a39] 11728 1726882224.68112: sending task result for task 12673a56-9f93-5c28-a762-000000000a39 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11728 1726882224.68259: no more pending results, returning what we have 11728 1726882224.68263: results queue empty 11728 1726882224.68264: checking for any_errors_fatal 11728 1726882224.68269: done checking for 
any_errors_fatal 11728 1726882224.68270: checking for max_fail_percentage 11728 1726882224.68272: done checking for max_fail_percentage 11728 1726882224.68273: checking to see if all hosts have failed and the running result is not ok 11728 1726882224.68274: done checking to see if all hosts have failed 11728 1726882224.68275: getting the remaining hosts for this loop 11728 1726882224.68276: done getting the remaining hosts for this loop 11728 1726882224.68280: getting the next task for host managed_node3 11728 1726882224.68287: done getting next task for host managed_node3 11728 1726882224.68291: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11728 1726882224.68300: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882224.68322: getting variables 11728 1726882224.68324: in VariableManager get_vars() 11728 1726882224.68367: Calling all_inventory to load vars for managed_node3 11728 1726882224.68370: Calling groups_inventory to load vars for managed_node3 11728 1726882224.68372: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882224.68383: Calling all_plugins_play to load vars for managed_node3 11728 1726882224.68386: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882224.68389: Calling groups_plugins_play to load vars for managed_node3 11728 1726882224.69652: done sending task result for task 12673a56-9f93-5c28-a762-000000000a39 11728 1726882224.69655: WORKER PROCESS EXITING 11728 1726882224.71558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882224.73246: done with get_vars() 11728 1726882224.73270: done getting variables 11728 1726882224.73334: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:30:24 -0400 (0:00:00.144) 0:00:49.586 ****** 11728 1726882224.73369: entering _queue_task() for managed_node3/service 11728 1726882224.73717: worker is 1 (out of 1 available) 11728 1726882224.73730: exiting _queue_task() for managed_node3/service 11728 1726882224.73742: done queuing things up, now waiting for results queue to drain 11728 1726882224.73743: waiting for pending results... 
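The "Enable and start NetworkManager" task (tasks/main.yml:122) is the first of these three that actually runs: network_provider is "nm", so its conditional passes and the service action hands off to the systemd module on the managed node. Judging from the conditional evaluated below and the module invocation echoed at the end of this section (name=NetworkManager, state=started, enabled=true), the task is roughly equivalent to the following sketch (the use of network_service_name is an assumption based on the role defaults referenced in the log):

```yaml
# Sketch only: reconstructed from the when-condition and the systemd module_args
# visible later in this log; not copied from the role source.
- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"   # resolves to NetworkManager in this run
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}
```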
11728 1726882224.74034: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11728 1726882224.74241: in run() - task 12673a56-9f93-5c28-a762-000000000a3a 11728 1726882224.74282: variable 'ansible_search_path' from source: unknown 11728 1726882224.74290: variable 'ansible_search_path' from source: unknown 11728 1726882224.74339: calling self._execute() 11728 1726882224.74578: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882224.74592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882224.74802: variable 'omit' from source: magic vars 11728 1726882224.75006: variable 'ansible_distribution_major_version' from source: facts 11728 1726882224.75026: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882224.75192: variable 'network_provider' from source: set_fact 11728 1726882224.75209: variable 'network_state' from source: role '' defaults 11728 1726882224.75225: Evaluated conditional (network_provider == "nm" or network_state != {}): True 11728 1726882224.75235: variable 'omit' from source: magic vars 11728 1726882224.75316: variable 'omit' from source: magic vars 11728 1726882224.75347: variable 'network_service_name' from source: role '' defaults 11728 1726882224.75418: variable 'network_service_name' from source: role '' defaults 11728 1726882224.75681: variable '__network_provider_setup' from source: role '' defaults 11728 1726882224.75684: variable '__network_service_name_default_nm' from source: role '' defaults 11728 1726882224.75686: variable '__network_service_name_default_nm' from source: role '' defaults 11728 1726882224.75688: variable '__network_packages_default_nm' from source: role '' defaults 11728 1726882224.75751: variable '__network_packages_default_nm' from source: role '' defaults 11728 1726882224.76053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882224.78684: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882224.78792: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882224.78905: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882224.78961: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882224.78995: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882224.79118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882224.79155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882224.79189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882224.79286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 11728 1726882224.79290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882224.79323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882224.79353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882224.79384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882224.79431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882224.79453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882224.79692: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11728 1726882224.79818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882224.79898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882224.79901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882224.79913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882224.79938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882224.80035: variable 'ansible_python' from source: facts 11728 1726882224.80055: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11728 1726882224.80143: variable '__network_wpa_supplicant_required' from source: role '' defaults 11728 1726882224.80225: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11728 1726882224.80396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882224.80477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882224.80533: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882224.80575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882224.80598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882224.80652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882224.80748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882224.80752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882224.80818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882224.80838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882224.80997: variable 'network_connections' from source: task vars 11728 1726882224.81019: variable 'controller_profile' from source: play vars 11728 1726882224.81128: variable 'controller_profile' from source: play vars 11728 1726882224.81131: variable 'controller_device' from source: play vars 11728 1726882224.81195: variable 'controller_device' from source: play vars 11728 1726882224.81214: variable 'dhcp_interface1' from source: play vars 11728 1726882224.81295: variable 'dhcp_interface1' from source: play vars 11728 1726882224.81314: variable 'port1_profile' from source: play vars 11728 1726882224.81402: variable 'port1_profile' from source: play vars 11728 1726882224.81453: variable 'dhcp_interface1' from source: play vars 11728 1726882224.81489: variable 'dhcp_interface1' from source: play vars 11728 1726882224.81512: variable 'controller_profile' from source: play vars 11728 1726882224.81587: variable 'controller_profile' from source: play vars 11728 1726882224.81606: variable 'port2_profile' from source: play vars 11728 1726882224.81688: variable 'port2_profile' from source: play vars 11728 1726882224.81710: variable 'dhcp_interface2' from source: play vars 11728 1726882224.81836: variable 'dhcp_interface2' from source: play vars 11728 1726882224.81839: variable 'controller_profile' from source: play vars 11728 1726882224.81884: variable 'controller_profile' from source: play vars 11728 1726882224.82004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882224.82197: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882224.82248: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 
11728 1726882224.82313: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882224.82358: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882224.82560: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882224.82563: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882224.82565: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882224.82600: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882224.82651: variable '__network_wireless_connections_defined' from source: role '' defaults 11728 1726882224.83231: variable 'network_connections' from source: task vars 11728 1726882224.83237: variable 'controller_profile' from source: play vars 11728 1726882224.83424: variable 'controller_profile' from source: play vars 11728 1726882224.83436: variable 'controller_device' from source: play vars 11728 1726882224.83610: variable 'controller_device' from source: play vars 11728 1726882224.83627: variable 'dhcp_interface1' from source: play vars 11728 1726882224.83691: variable 'dhcp_interface1' from source: play vars 11728 1726882224.83763: variable 'port1_profile' from source: play vars 11728 1726882224.83840: variable 'port1_profile' from source: play vars 11728 1726882224.83906: variable 'dhcp_interface1' from source: play vars 11728 1726882224.84266: variable 'dhcp_interface1' from source: play vars 11728 1726882224.84342: variable 'controller_profile' from source: play vars 11728 1726882224.84440: variable 'controller_profile' from source: play vars 11728 1726882224.84450: variable 'port2_profile' from source: play vars 11728 1726882224.84622: variable 'port2_profile' from source: play vars 11728 1726882224.84633: variable 'dhcp_interface2' from source: play vars 11728 1726882224.84907: variable 'dhcp_interface2' from source: play vars 11728 1726882224.84920: variable 'controller_profile' from source: play vars 11728 1726882224.84987: variable 'controller_profile' from source: play vars 11728 1726882224.85043: variable '__network_packages_default_wireless' from source: role '' defaults 11728 1726882224.85121: variable '__network_wireless_connections_defined' from source: role '' defaults 11728 1726882224.85558: variable 'network_connections' from source: task vars 11728 1726882224.85564: variable 'controller_profile' from source: play vars 11728 1726882224.85743: variable 'controller_profile' from source: play vars 11728 1726882224.85751: variable 'controller_device' from source: play vars 11728 1726882224.85818: variable 'controller_device' from source: play vars 11728 1726882224.85826: variable 'dhcp_interface1' from source: play vars 11728 1726882224.85904: variable 'dhcp_interface1' from source: play vars 11728 1726882224.85915: variable 'port1_profile' from source: play vars 11728 1726882224.86002: variable 'port1_profile' from source: play vars 11728 1726882224.86005: 
variable 'dhcp_interface1' from source: play vars 11728 1726882224.86265: variable 'dhcp_interface1' from source: play vars 11728 1726882224.86271: variable 'controller_profile' from source: play vars 11728 1726882224.86343: variable 'controller_profile' from source: play vars 11728 1726882224.86349: variable 'port2_profile' from source: play vars 11728 1726882224.86423: variable 'port2_profile' from source: play vars 11728 1726882224.86426: variable 'dhcp_interface2' from source: play vars 11728 1726882224.86492: variable 'dhcp_interface2' from source: play vars 11728 1726882224.86605: variable 'controller_profile' from source: play vars 11728 1726882224.86670: variable 'controller_profile' from source: play vars 11728 1726882224.86744: variable '__network_packages_default_team' from source: role '' defaults 11728 1726882224.86879: variable '__network_team_connections_defined' from source: role '' defaults 11728 1726882224.87373: variable 'network_connections' from source: task vars 11728 1726882224.87376: variable 'controller_profile' from source: play vars 11728 1726882224.87500: variable 'controller_profile' from source: play vars 11728 1726882224.87507: variable 'controller_device' from source: play vars 11728 1726882224.87572: variable 'controller_device' from source: play vars 11728 1726882224.87613: variable 'dhcp_interface1' from source: play vars 11728 1726882224.87680: variable 'dhcp_interface1' from source: play vars 11728 1726882224.87689: variable 'port1_profile' from source: play vars 11728 1726882224.87828: variable 'port1_profile' from source: play vars 11728 1726882224.87832: variable 'dhcp_interface1' from source: play vars 11728 1726882224.87834: variable 'dhcp_interface1' from source: play vars 11728 1726882224.87836: variable 'controller_profile' from source: play vars 11728 1726882224.87906: variable 'controller_profile' from source: play vars 11728 1726882224.87915: variable 'port2_profile' from source: play vars 11728 1726882224.87977: variable 'port2_profile' from source: play vars 11728 1726882224.87984: variable 'dhcp_interface2' from source: play vars 11728 1726882224.88054: variable 'dhcp_interface2' from source: play vars 11728 1726882224.88061: variable 'controller_profile' from source: play vars 11728 1726882224.88131: variable 'controller_profile' from source: play vars 11728 1726882224.88191: variable '__network_service_name_default_initscripts' from source: role '' defaults 11728 1726882224.88264: variable '__network_service_name_default_initscripts' from source: role '' defaults 11728 1726882224.88267: variable '__network_packages_default_initscripts' from source: role '' defaults 11728 1726882224.88373: variable '__network_packages_default_initscripts' from source: role '' defaults 11728 1726882224.88529: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11728 1726882224.89015: variable 'network_connections' from source: task vars 11728 1726882224.89018: variable 'controller_profile' from source: play vars 11728 1726882224.89137: variable 'controller_profile' from source: play vars 11728 1726882224.89140: variable 'controller_device' from source: play vars 11728 1726882224.89143: variable 'controller_device' from source: play vars 11728 1726882224.89148: variable 'dhcp_interface1' from source: play vars 11728 1726882224.89280: variable 'dhcp_interface1' from source: play vars 11728 1726882224.89287: variable 'port1_profile' from source: play vars 11728 1726882224.89446: variable 'port1_profile' from source: play 
vars 11728 1726882224.89449: variable 'dhcp_interface1' from source: play vars 11728 1726882224.89451: variable 'dhcp_interface1' from source: play vars 11728 1726882224.89453: variable 'controller_profile' from source: play vars 11728 1726882224.89474: variable 'controller_profile' from source: play vars 11728 1726882224.89481: variable 'port2_profile' from source: play vars 11728 1726882224.89544: variable 'port2_profile' from source: play vars 11728 1726882224.89547: variable 'dhcp_interface2' from source: play vars 11728 1726882224.89811: variable 'dhcp_interface2' from source: play vars 11728 1726882224.89814: variable 'controller_profile' from source: play vars 11728 1726882224.89871: variable 'controller_profile' from source: play vars 11728 1726882224.89879: variable 'ansible_distribution' from source: facts 11728 1726882224.89882: variable '__network_rh_distros' from source: role '' defaults 11728 1726882224.89889: variable 'ansible_distribution_major_version' from source: facts 11728 1726882224.90027: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11728 1726882224.90502: variable 'ansible_distribution' from source: facts 11728 1726882224.90506: variable '__network_rh_distros' from source: role '' defaults 11728 1726882224.90512: variable 'ansible_distribution_major_version' from source: facts 11728 1726882224.90524: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11728 1726882224.90910: variable 'ansible_distribution' from source: facts 11728 1726882224.90913: variable '__network_rh_distros' from source: role '' defaults 11728 1726882224.90915: variable 'ansible_distribution_major_version' from source: facts 11728 1726882224.90917: variable 'network_provider' from source: set_fact 11728 1726882224.90927: variable 'omit' from source: magic vars 11728 1726882224.90954: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882224.90982: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882224.91201: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882224.91205: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882224.91207: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882224.91210: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882224.91212: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882224.91214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882224.91278: Set connection var ansible_connection to ssh 11728 1726882224.91309: Set connection var ansible_shell_executable to /bin/sh 11728 1726882224.91312: Set connection var ansible_timeout to 10 11728 1726882224.91314: Set connection var ansible_shell_type to sh 11728 1726882224.91316: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882224.91318: Set connection var ansible_pipelining to False 11728 1726882224.91334: variable 'ansible_shell_executable' from source: unknown 11728 1726882224.91337: variable 'ansible_connection' from source: unknown 11728 1726882224.91340: variable 'ansible_module_compression' from source: unknown 11728 1726882224.91342: 
variable 'ansible_shell_type' from source: unknown 11728 1726882224.91344: variable 'ansible_shell_executable' from source: unknown 11728 1726882224.91346: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882224.91351: variable 'ansible_pipelining' from source: unknown 11728 1726882224.91354: variable 'ansible_timeout' from source: unknown 11728 1726882224.91358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882224.91467: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882224.91477: variable 'omit' from source: magic vars 11728 1726882224.91483: starting attempt loop 11728 1726882224.91486: running the handler 11728 1726882224.91636: variable 'ansible_facts' from source: unknown 11728 1726882224.92465: _low_level_execute_command(): starting 11728 1726882224.92474: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882224.93209: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882224.93225: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882224.93299: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882224.93453: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882224.95023: stdout chunk (state=3): >>>/root <<< 11728 1726882224.95146: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882224.95149: stdout chunk (state=3): >>><<< 11728 1726882224.95162: stderr chunk (state=3): >>><<< 11728 1726882224.95177: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882224.95188: _low_level_execute_command(): starting 11728 1726882224.95200: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882224.9517663-14175-99186912542746 `" && echo ansible-tmp-1726882224.9517663-14175-99186912542746="` echo /root/.ansible/tmp/ansible-tmp-1726882224.9517663-14175-99186912542746 `" ) && sleep 0' 11728 1726882224.96515: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882224.96624: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882224.96649: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882224.96667: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882224.96740: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882224.97008: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882224.98811: stdout chunk (state=3): >>>ansible-tmp-1726882224.9517663-14175-99186912542746=/root/.ansible/tmp/ansible-tmp-1726882224.9517663-14175-99186912542746 <<< 11728 1726882224.98947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882224.98957: stdout chunk (state=3): >>><<< 11728 1726882224.98970: stderr chunk (state=3): >>><<< 11728 1726882224.98989: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882224.9517663-14175-99186912542746=/root/.ansible/tmp/ansible-tmp-1726882224.9517663-14175-99186912542746 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882224.99026: variable 'ansible_module_compression' from source: unknown 11728 1726882224.99085: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 11728 1726882224.99155: variable 'ansible_facts' from source: unknown 11728 1726882224.99385: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882224.9517663-14175-99186912542746/AnsiballZ_systemd.py 11728 1726882224.99540: Sending initial data 11728 1726882224.99549: Sent initial data (155 bytes) 11728 1726882225.00152: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882225.00228: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882225.00307: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882225.00342: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882225.00375: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882225.00468: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882225.01978: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882225.02041: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882225.02085: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpqszunxt1 /root/.ansible/tmp/ansible-tmp-1726882224.9517663-14175-99186912542746/AnsiballZ_systemd.py <<< 11728 1726882225.02098: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882224.9517663-14175-99186912542746/AnsiballZ_systemd.py" <<< 11728 1726882225.02129: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpqszunxt1" to remote "/root/.ansible/tmp/ansible-tmp-1726882224.9517663-14175-99186912542746/AnsiballZ_systemd.py" <<< 11728 1726882225.02134: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882224.9517663-14175-99186912542746/AnsiballZ_systemd.py" <<< 11728 1726882225.03211: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882225.03246: stderr chunk (state=3): >>><<< 11728 1726882225.03249: stdout chunk (state=3): >>><<< 11728 1726882225.03264: done transferring module to remote 11728 1726882225.03273: _low_level_execute_command(): starting 11728 1726882225.03277: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882224.9517663-14175-99186912542746/ /root/.ansible/tmp/ansible-tmp-1726882224.9517663-14175-99186912542746/AnsiballZ_systemd.py && sleep 0' 11728 1726882225.03698: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882225.03705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882225.03707: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882225.03709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 11728 1726882225.03712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882225.03798: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882225.03817: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882225.03905: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882225.05635: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882225.05656: stderr chunk (state=3): >>><<< 11728 1726882225.05659: stdout chunk (state=3): >>><<< 11728 1726882225.05671: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882225.05674: _low_level_execute_command(): starting 11728 1726882225.05678: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882224.9517663-14175-99186912542746/AnsiballZ_systemd.py && sleep 0' 11728 1726882225.06067: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882225.06071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882225.06086: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882225.06149: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882225.06153: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882225.06201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882225.34828: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "711", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", 
"StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainStartTimestampMonotonic": "33869352", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainHandoffTimestampMonotonic": "33887880", "ExecMainPID": "711", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10448896", "MemoryPeak": "13234176", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3305893888", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "870355000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 11728 1726882225.34852: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": 
"8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", 
"Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target NetworkManager-wait-online.service shutdown.target cloud-init.service multi-user.target", "After": "system.slice network-pre.target dbus.socket systemd-journald.socket sysinit.target cloud-init-local.service dbus-broker.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:09 EDT", "StateChangeTimestampMonotonic": "456380513", "InactiveExitTimestamp": "Fri 2024-09-20 21:22:06 EDT", "InactiveExitTimestampMonotonic": "33869684", "ActiveEnterTimestamp": "Fri 2024-09-20 21:22:07 EDT", "ActiveEnterTimestampMonotonic": "34618487", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ConditionTimestampMonotonic": "33868497", "AssertTimestamp": "Fri 2024-09-20 21:22:06 EDT", "AssertTimestampMonotonic": "33868500", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4af90bef47894571b134cea51cfa03e2", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11728 1726882225.36800: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 11728 1726882225.36803: stdout chunk (state=3): >>><<< 11728 1726882225.36806: stderr chunk (state=3): >>><<< 11728 1726882225.36810: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "711", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainStartTimestampMonotonic": "33869352", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainHandoffTimestampMonotonic": "33887880", "ExecMainPID": "711", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10448896", "MemoryPeak": "13234176", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3305893888", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "870355000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target NetworkManager-wait-online.service shutdown.target cloud-init.service multi-user.target", "After": "system.slice network-pre.target dbus.socket systemd-journald.socket sysinit.target cloud-init-local.service dbus-broker.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:09 EDT", "StateChangeTimestampMonotonic": "456380513", "InactiveExitTimestamp": "Fri 2024-09-20 21:22:06 EDT", "InactiveExitTimestampMonotonic": "33869684", "ActiveEnterTimestamp": "Fri 2024-09-20 21:22:07 EDT", "ActiveEnterTimestampMonotonic": "34618487", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ConditionTimestampMonotonic": "33868497", "AssertTimestamp": "Fri 2024-09-20 21:22:06 EDT", "AssertTimestampMonotonic": "33868500", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4af90bef47894571b134cea51cfa03e2", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 11728 1726882225.37157: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882224.9517663-14175-99186912542746/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882225.37232: _low_level_execute_command(): starting 11728 1726882225.37388: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882224.9517663-14175-99186912542746/ > /dev/null 2>&1 && sleep 0' 11728 1726882225.38426: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882225.38475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882225.38488: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882225.38505: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882225.38738: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882225.38748: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882225.38801: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882225.38873: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882225.40708: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882225.40760: stderr chunk (state=3): >>><<< 11728 1726882225.40763: stdout chunk (state=3): >>><<< 11728 1726882225.40801: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882225.40804: handler run complete 11728 1726882225.40829: attempt loop complete, returning result 11728 1726882225.40832: _execute() done 11728 1726882225.40834: dumping result to json 11728 1726882225.40846: done dumping result, returning 11728 1726882225.40854: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-5c28-a762-000000000a3a] 11728 1726882225.40859: sending task result for task 12673a56-9f93-5c28-a762-000000000a3a 11728 1726882225.41067: done sending task result for task 12673a56-9f93-5c28-a762-000000000a3a 11728 1726882225.41071: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11728 1726882225.41135: no more pending results, returning what we have 11728 1726882225.41139: results queue empty 11728 1726882225.41140: checking for any_errors_fatal 11728 1726882225.41146: done checking for any_errors_fatal 11728 1726882225.41147: checking for max_fail_percentage 11728 1726882225.41149: done checking for max_fail_percentage 11728 1726882225.41150: checking to see if all hosts have failed and the running result is not ok 11728 1726882225.41150: done checking to see if all hosts have failed 11728 1726882225.41151: getting the remaining hosts for this loop 11728 1726882225.41153: done getting the remaining hosts for this loop 11728 1726882225.41156: getting the next task for host managed_node3 11728 1726882225.41162: done getting next task for host managed_node3 11728 1726882225.41166: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11728 1726882225.41171: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882225.41181: getting variables 11728 1726882225.41183: in VariableManager get_vars() 11728 1726882225.41257: Calling all_inventory to load vars for managed_node3 11728 1726882225.41260: Calling groups_inventory to load vars for managed_node3 11728 1726882225.41262: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882225.41271: Calling all_plugins_play to load vars for managed_node3 11728 1726882225.41274: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882225.41276: Calling groups_plugins_play to load vars for managed_node3 11728 1726882225.42290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882225.43514: done with get_vars() 11728 1726882225.43530: done getting variables 11728 1726882225.43572: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:30:25 -0400 (0:00:00.702) 0:00:50.288 ****** 11728 1726882225.43604: entering _queue_task() for managed_node3/service 11728 1726882225.43854: worker is 1 (out of 1 available) 11728 1726882225.43867: exiting _queue_task() for managed_node3/service 11728 1726882225.43879: done queuing things up, now waiting for results queue to drain 11728 1726882225.43880: waiting for pending results... 
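The TASK banner above introduces the wpa_supplicant step (task path roles/network/tasks/main.yml:133). In the evaluation that follows, network_provider == "nm" is True but __network_wpa_supplicant_required is False, so the task is skipped. A task gated this way would look roughly like the sketch below; it is illustrative only and not the literal role source.

# Illustrative sketch of the conditional gating evaluated below (not the
# literal task at roles/network/tasks/main.yml:133).
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - network_provider == "nm"                    # evaluated True in the log below
    - __network_wpa_supplicant_required | bool    # evaluated False, so the task is skipped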
11728 1726882225.44062: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11728 1726882225.44168: in run() - task 12673a56-9f93-5c28-a762-000000000a3b 11728 1726882225.44179: variable 'ansible_search_path' from source: unknown 11728 1726882225.44182: variable 'ansible_search_path' from source: unknown 11728 1726882225.44218: calling self._execute() 11728 1726882225.44287: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882225.44292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882225.44304: variable 'omit' from source: magic vars 11728 1726882225.44590: variable 'ansible_distribution_major_version' from source: facts 11728 1726882225.44604: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882225.44687: variable 'network_provider' from source: set_fact 11728 1726882225.44690: Evaluated conditional (network_provider == "nm"): True 11728 1726882225.44777: variable '__network_wpa_supplicant_required' from source: role '' defaults 11728 1726882225.45000: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11728 1726882225.45040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882225.46847: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882225.46895: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882225.46924: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882225.46950: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882225.46978: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882225.47044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882225.47065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882225.47085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882225.47117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882225.47128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882225.47161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882225.47177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 11728 1726882225.47200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882225.47227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882225.47238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882225.47266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882225.47283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882225.47308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882225.47331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882225.47342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882225.47440: variable 'network_connections' from source: task vars 11728 1726882225.47450: variable 'controller_profile' from source: play vars 11728 1726882225.47497: variable 'controller_profile' from source: play vars 11728 1726882225.47509: variable 'controller_device' from source: play vars 11728 1726882225.47555: variable 'controller_device' from source: play vars 11728 1726882225.47563: variable 'dhcp_interface1' from source: play vars 11728 1726882225.47608: variable 'dhcp_interface1' from source: play vars 11728 1726882225.47614: variable 'port1_profile' from source: play vars 11728 1726882225.47658: variable 'port1_profile' from source: play vars 11728 1726882225.47665: variable 'dhcp_interface1' from source: play vars 11728 1726882225.47710: variable 'dhcp_interface1' from source: play vars 11728 1726882225.47715: variable 'controller_profile' from source: play vars 11728 1726882225.47765: variable 'controller_profile' from source: play vars 11728 1726882225.47797: variable 'port2_profile' from source: play vars 11728 1726882225.47856: variable 'port2_profile' from source: play vars 11728 1726882225.47860: variable 'dhcp_interface2' from source: play vars 11728 1726882225.47957: variable 'dhcp_interface2' from source: play vars 11728 1726882225.47960: variable 'controller_profile' from source: play vars 11728 1726882225.48002: variable 'controller_profile' from source: play vars 11728 1726882225.48063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882225.48400: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882225.48403: Loading 
TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882225.48405: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882225.48407: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882225.48409: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882225.48412: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882225.48456: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882225.48498: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882225.48566: variable '__network_wireless_connections_defined' from source: role '' defaults 11728 1726882225.48863: variable 'network_connections' from source: task vars 11728 1726882225.48866: variable 'controller_profile' from source: play vars 11728 1726882225.48868: variable 'controller_profile' from source: play vars 11728 1726882225.48879: variable 'controller_device' from source: play vars 11728 1726882225.48941: variable 'controller_device' from source: play vars 11728 1726882225.48955: variable 'dhcp_interface1' from source: play vars 11728 1726882225.49023: variable 'dhcp_interface1' from source: play vars 11728 1726882225.49037: variable 'port1_profile' from source: play vars 11728 1726882225.49106: variable 'port1_profile' from source: play vars 11728 1726882225.49119: variable 'dhcp_interface1' from source: play vars 11728 1726882225.49178: variable 'dhcp_interface1' from source: play vars 11728 1726882225.49198: variable 'controller_profile' from source: play vars 11728 1726882225.49257: variable 'controller_profile' from source: play vars 11728 1726882225.49299: variable 'port2_profile' from source: play vars 11728 1726882225.49336: variable 'port2_profile' from source: play vars 11728 1726882225.49347: variable 'dhcp_interface2' from source: play vars 11728 1726882225.49413: variable 'dhcp_interface2' from source: play vars 11728 1726882225.49501: variable 'controller_profile' from source: play vars 11728 1726882225.49505: variable 'controller_profile' from source: play vars 11728 1726882225.49540: Evaluated conditional (__network_wpa_supplicant_required): False 11728 1726882225.49548: when evaluation is False, skipping this task 11728 1726882225.49555: _execute() done 11728 1726882225.49561: dumping result to json 11728 1726882225.49600: done dumping result, returning 11728 1726882225.49603: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-5c28-a762-000000000a3b] 11728 1726882225.49606: sending task result for task 12673a56-9f93-5c28-a762-000000000a3b 11728 1726882225.49711: done sending task result for task 12673a56-9f93-5c28-a762-000000000a3b 11728 1726882225.49715: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": 
"__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 11728 1726882225.49768: no more pending results, returning what we have 11728 1726882225.49772: results queue empty 11728 1726882225.49773: checking for any_errors_fatal 11728 1726882225.49791: done checking for any_errors_fatal 11728 1726882225.49792: checking for max_fail_percentage 11728 1726882225.49799: done checking for max_fail_percentage 11728 1726882225.49800: checking to see if all hosts have failed and the running result is not ok 11728 1726882225.49800: done checking to see if all hosts have failed 11728 1726882225.49801: getting the remaining hosts for this loop 11728 1726882225.49803: done getting the remaining hosts for this loop 11728 1726882225.49806: getting the next task for host managed_node3 11728 1726882225.49814: done getting next task for host managed_node3 11728 1726882225.49817: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 11728 1726882225.49822: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882225.49843: getting variables 11728 1726882225.49845: in VariableManager get_vars() 11728 1726882225.49886: Calling all_inventory to load vars for managed_node3 11728 1726882225.49889: Calling groups_inventory to load vars for managed_node3 11728 1726882225.49891: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882225.49916: Calling all_plugins_play to load vars for managed_node3 11728 1726882225.49920: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882225.49924: Calling groups_plugins_play to load vars for managed_node3 11728 1726882225.50756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882225.51644: done with get_vars() 11728 1726882225.51673: done getting variables 11728 1726882225.51735: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:30:25 -0400 (0:00:00.081) 0:00:50.370 ****** 11728 1726882225.51770: entering _queue_task() for managed_node3/service 11728 1726882225.52117: worker is 1 (out of 1 available) 11728 1726882225.52131: exiting _queue_task() for managed_node3/service 11728 1726882225.52142: done queuing things up, now waiting for results queue to drain 11728 1726882225.52144: waiting for pending results... 11728 1726882225.52518: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 11728 1726882225.52614: in run() - task 12673a56-9f93-5c28-a762-000000000a3c 11728 1726882225.52639: variable 'ansible_search_path' from source: unknown 11728 1726882225.52652: variable 'ansible_search_path' from source: unknown 11728 1726882225.52735: calling self._execute() 11728 1726882225.52789: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882225.52795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882225.52806: variable 'omit' from source: magic vars 11728 1726882225.53084: variable 'ansible_distribution_major_version' from source: facts 11728 1726882225.53172: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882225.53238: variable 'network_provider' from source: set_fact 11728 1726882225.53243: Evaluated conditional (network_provider == "initscripts"): False 11728 1726882225.53249: when evaluation is False, skipping this task 11728 1726882225.53252: _execute() done 11728 1726882225.53254: dumping result to json 11728 1726882225.53257: done dumping result, returning 11728 1726882225.53264: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-5c28-a762-000000000a3c] 11728 1726882225.53269: sending task result for task 12673a56-9f93-5c28-a762-000000000a3c 11728 1726882225.53355: done sending task result for task 12673a56-9f93-5c28-a762-000000000a3c 11728 1726882225.53357: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11728 
1726882225.53438: no more pending results, returning what we have 11728 1726882225.53442: results queue empty 11728 1726882225.53443: checking for any_errors_fatal 11728 1726882225.53451: done checking for any_errors_fatal 11728 1726882225.53451: checking for max_fail_percentage 11728 1726882225.53453: done checking for max_fail_percentage 11728 1726882225.53454: checking to see if all hosts have failed and the running result is not ok 11728 1726882225.53454: done checking to see if all hosts have failed 11728 1726882225.53455: getting the remaining hosts for this loop 11728 1726882225.53457: done getting the remaining hosts for this loop 11728 1726882225.53460: getting the next task for host managed_node3 11728 1726882225.53469: done getting next task for host managed_node3 11728 1726882225.53473: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11728 1726882225.53479: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882225.53500: getting variables 11728 1726882225.53502: in VariableManager get_vars() 11728 1726882225.53536: Calling all_inventory to load vars for managed_node3 11728 1726882225.53539: Calling groups_inventory to load vars for managed_node3 11728 1726882225.53541: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882225.53549: Calling all_plugins_play to load vars for managed_node3 11728 1726882225.53551: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882225.53554: Calling groups_plugins_play to load vars for managed_node3 11728 1726882225.54458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882225.55553: done with get_vars() 11728 1726882225.55579: done getting variables 11728 1726882225.55642: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:30:25 -0400 (0:00:00.039) 0:00:50.409 ****** 11728 1726882225.55680: entering _queue_task() for managed_node3/copy 11728 1726882225.56033: worker is 1 (out of 1 available) 11728 1726882225.56049: exiting _queue_task() for managed_node3/copy 11728 1726882225.56070: done queuing things up, now waiting for results queue to drain 11728 1726882225.56072: waiting for pending results... 11728 1726882225.56641: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11728 1726882225.56747: in run() - task 12673a56-9f93-5c28-a762-000000000a3d 11728 1726882225.56769: variable 'ansible_search_path' from source: unknown 11728 1726882225.56778: variable 'ansible_search_path' from source: unknown 11728 1726882225.56826: calling self._execute() 11728 1726882225.56943: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882225.56963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882225.56979: variable 'omit' from source: magic vars 11728 1726882225.57379: variable 'ansible_distribution_major_version' from source: facts 11728 1726882225.57404: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882225.57585: variable 'network_provider' from source: set_fact 11728 1726882225.57589: Evaluated conditional (network_provider == "initscripts"): False 11728 1726882225.57591: when evaluation is False, skipping this task 11728 1726882225.57600: _execute() done 11728 1726882225.57602: dumping result to json 11728 1726882225.57608: done dumping result, returning 11728 1726882225.57611: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-5c28-a762-000000000a3d] 11728 1726882225.57613: sending task result for task 12673a56-9f93-5c28-a762-000000000a3d 11728 1726882225.57685: done sending task result for task 12673a56-9f93-5c28-a762-000000000a3d 11728 1726882225.57688: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == 
\"initscripts\"", "skip_reason": "Conditional result was False" } 11728 1726882225.57755: no more pending results, returning what we have 11728 1726882225.57758: results queue empty 11728 1726882225.57759: checking for any_errors_fatal 11728 1726882225.57764: done checking for any_errors_fatal 11728 1726882225.57765: checking for max_fail_percentage 11728 1726882225.57766: done checking for max_fail_percentage 11728 1726882225.57767: checking to see if all hosts have failed and the running result is not ok 11728 1726882225.57768: done checking to see if all hosts have failed 11728 1726882225.57769: getting the remaining hosts for this loop 11728 1726882225.57770: done getting the remaining hosts for this loop 11728 1726882225.57773: getting the next task for host managed_node3 11728 1726882225.57787: done getting next task for host managed_node3 11728 1726882225.57791: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11728 1726882225.57800: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882225.57818: getting variables 11728 1726882225.57819: in VariableManager get_vars() 11728 1726882225.57856: Calling all_inventory to load vars for managed_node3 11728 1726882225.57858: Calling groups_inventory to load vars for managed_node3 11728 1726882225.57860: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882225.57869: Calling all_plugins_play to load vars for managed_node3 11728 1726882225.57871: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882225.57873: Calling groups_plugins_play to load vars for managed_node3 11728 1726882225.58919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882225.60430: done with get_vars() 11728 1726882225.60460: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:30:25 -0400 (0:00:00.048) 0:00:50.457 ****** 11728 1726882225.60555: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 11728 1726882225.60906: worker is 1 (out of 1 available) 11728 1726882225.60919: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 11728 1726882225.60932: done queuing things up, now waiting for results queue to drain 11728 1726882225.60933: waiting for pending results... 11728 1726882225.61259: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11728 1726882225.61371: in run() - task 12673a56-9f93-5c28-a762-000000000a3e 11728 1726882225.61392: variable 'ansible_search_path' from source: unknown 11728 1726882225.61410: variable 'ansible_search_path' from source: unknown 11728 1726882225.61446: calling self._execute() 11728 1726882225.61628: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882225.61633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882225.61635: variable 'omit' from source: magic vars 11728 1726882225.61940: variable 'ansible_distribution_major_version' from source: facts 11728 1726882225.61959: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882225.61968: variable 'omit' from source: magic vars 11728 1726882225.62037: variable 'omit' from source: magic vars 11728 1726882225.62205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882225.64652: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882225.64727: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882225.64767: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882225.64814: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882225.64847: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882225.64931: variable 'network_provider' from source: set_fact 11728 1726882225.65101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 11728 1726882225.65104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882225.65134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882225.65177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882225.65198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882225.65276: variable 'omit' from source: magic vars 11728 1726882225.65425: variable 'omit' from source: magic vars 11728 1726882225.65500: variable 'network_connections' from source: task vars 11728 1726882225.65517: variable 'controller_profile' from source: play vars 11728 1726882225.65583: variable 'controller_profile' from source: play vars 11728 1726882225.65600: variable 'controller_device' from source: play vars 11728 1726882225.65667: variable 'controller_device' from source: play vars 11728 1726882225.65751: variable 'dhcp_interface1' from source: play vars 11728 1726882225.65755: variable 'dhcp_interface1' from source: play vars 11728 1726882225.65761: variable 'port1_profile' from source: play vars 11728 1726882225.65823: variable 'port1_profile' from source: play vars 11728 1726882225.65836: variable 'dhcp_interface1' from source: play vars 11728 1726882225.65902: variable 'dhcp_interface1' from source: play vars 11728 1726882225.65914: variable 'controller_profile' from source: play vars 11728 1726882225.65978: variable 'controller_profile' from source: play vars 11728 1726882225.65992: variable 'port2_profile' from source: play vars 11728 1726882225.66054: variable 'port2_profile' from source: play vars 11728 1726882225.66067: variable 'dhcp_interface2' from source: play vars 11728 1726882225.66131: variable 'dhcp_interface2' from source: play vars 11728 1726882225.66142: variable 'controller_profile' from source: play vars 11728 1726882225.66201: variable 'controller_profile' from source: play vars 11728 1726882225.66400: variable 'omit' from source: magic vars 11728 1726882225.66403: variable '__lsr_ansible_managed' from source: task vars 11728 1726882225.66465: variable '__lsr_ansible_managed' from source: task vars 11728 1726882225.66726: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 11728 1726882225.66885: Loaded config def from plugin (lookup/template) 11728 1726882225.66897: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 11728 1726882225.66929: File lookup term: get_ansible_managed.j2 11728 1726882225.66939: variable 'ansible_search_path' from source: unknown 11728 1726882225.66953: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 11728 1726882225.66969: search_path: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 11728 1726882225.66995: variable 'ansible_search_path' from source: unknown 11728 1726882225.80959: variable 'ansible_managed' from source: unknown 11728 1726882225.81384: variable 'omit' from source: magic vars 11728 1726882225.81388: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882225.81416: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882225.81509: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882225.81530: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882225.81544: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882225.81569: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882225.81608: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882225.81617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882225.81733: Set connection var ansible_connection to ssh 11728 1726882225.81750: Set connection var ansible_shell_executable to /bin/sh 11728 1726882225.81769: Set connection var ansible_timeout to 10 11728 1726882225.81776: Set connection var ansible_shell_type to sh 11728 1726882225.81789: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882225.81801: Set connection var ansible_pipelining to False 11728 1726882225.81834: variable 'ansible_shell_executable' from source: unknown 11728 1726882225.81841: variable 'ansible_connection' from source: unknown 11728 1726882225.81848: variable 'ansible_module_compression' from source: unknown 11728 1726882225.81856: variable 'ansible_shell_type' from source: unknown 11728 1726882225.81862: variable 'ansible_shell_executable' from source: unknown 11728 1726882225.81869: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882225.81876: variable 'ansible_pipelining' from source: unknown 11728 1726882225.81883: variable 'ansible_timeout' from source: unknown 11728 1726882225.81890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882225.82037: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11728 1726882225.82068: variable 'omit' from source: magic vars 11728 1726882225.82071: starting attempt loop 11728 1726882225.82144: running the handler 11728 1726882225.82147: _low_level_execute_command(): starting 11728 1726882225.82149: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882225.82825: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882225.82927: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882225.82949: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882225.82964: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882225.83027: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882225.83263: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882225.84948: stdout chunk (state=3): >>>/root <<< 11728 1726882225.85087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882225.85101: stdout chunk (state=3): >>><<< 11728 1726882225.85296: stderr chunk (state=3): >>><<< 11728 1726882225.85303: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882225.85306: _low_level_execute_command(): starting 11728 1726882225.85309: _low_level_execute_command(): 
executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882225.8521323-14227-21372904954926 `" && echo ansible-tmp-1726882225.8521323-14227-21372904954926="` echo /root/.ansible/tmp/ansible-tmp-1726882225.8521323-14227-21372904954926 `" ) && sleep 0' 11728 1726882225.86237: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882225.86240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882225.86243: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882225.86245: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882225.86247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882225.86308: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882225.86325: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882225.86391: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882225.88344: stdout chunk (state=3): >>>ansible-tmp-1726882225.8521323-14227-21372904954926=/root/.ansible/tmp/ansible-tmp-1726882225.8521323-14227-21372904954926 <<< 11728 1726882225.88409: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882225.88412: stdout chunk (state=3): >>><<< 11728 1726882225.88488: stderr chunk (state=3): >>><<< 11728 1726882225.88700: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882225.8521323-14227-21372904954926=/root/.ansible/tmp/ansible-tmp-1726882225.8521323-14227-21372904954926 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882225.88772: variable 'ansible_module_compression' from source: unknown 11728 1726882225.88872: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 11728 1726882225.88985: variable 'ansible_facts' from source: unknown 11728 1726882225.89248: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882225.8521323-14227-21372904954926/AnsiballZ_network_connections.py 11728 1726882225.89437: Sending initial data 11728 1726882225.89441: Sent initial data (167 bytes) 11728 1726882225.89998: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882225.90010: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882225.90021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882225.90038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882225.90050: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882225.90081: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882225.90087: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882225.90165: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882225.90173: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882225.90250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882225.91871: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882225.91930: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882225.91999: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpf94puusg /root/.ansible/tmp/ansible-tmp-1726882225.8521323-14227-21372904954926/AnsiballZ_network_connections.py <<< 11728 1726882225.92011: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882225.8521323-14227-21372904954926/AnsiballZ_network_connections.py" <<< 11728 1726882225.92141: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpf94puusg" to remote "/root/.ansible/tmp/ansible-tmp-1726882225.8521323-14227-21372904954926/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882225.8521323-14227-21372904954926/AnsiballZ_network_connections.py" <<< 11728 1726882225.94304: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882225.94308: stdout chunk (state=3): >>><<< 11728 1726882225.94311: stderr chunk (state=3): >>><<< 11728 1726882225.94321: done transferring module to remote 11728 1726882225.94373: _low_level_execute_command(): starting 11728 1726882225.94382: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882225.8521323-14227-21372904954926/ /root/.ansible/tmp/ansible-tmp-1726882225.8521323-14227-21372904954926/AnsiballZ_network_connections.py && sleep 0' 11728 1726882225.95436: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882225.95459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882225.95776: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882225.95780: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882225.95895: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882225.95925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882225.97710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882225.97916: stderr chunk (state=3): >>><<< 11728 1726882225.97920: stdout chunk (state=3): >>><<< 11728 1726882225.97922: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882225.97925: _low_level_execute_command(): starting 11728 1726882225.97927: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882225.8521323-14227-21372904954926/AnsiballZ_network_connections.py && sleep 0' 11728 1726882225.99220: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882225.99310: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882225.99623: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882225.99745: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882225.99829: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882226.42506: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 2dd6ee50-995a-4f49-bd7b-b3c1e472ace8\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 5920b9c7-d6f0-4518-a9a4-f38b43b06206\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 8458f044-ac0a-4f70-866e-afbf375585aa\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 2dd6ee50-995a-4f49-bd7b-b3c1e472ace8 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 5920b9c7-d6f0-4518-a9a4-f38b43b06206 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 8458f044-ac0a-4f70-866e-afbf375585aa (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", 
"arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "primary": "test1"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "primary": "test1"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed<<< 11728 1726882226.42660: stdout chunk (state=3): >>>\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 11728 1726882226.44765: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882226.44769: stdout chunk (state=3): >>><<< 11728 1726882226.44771: stderr chunk (state=3): >>><<< 11728 1726882226.44774: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 2dd6ee50-995a-4f49-bd7b-b3c1e472ace8\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 5920b9c7-d6f0-4518-a9a4-f38b43b06206\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 8458f044-ac0a-4f70-866e-afbf375585aa\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 2dd6ee50-995a-4f49-bd7b-b3c1e472ace8 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 5920b9c7-d6f0-4518-a9a4-f38b43b06206 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 8458f044-ac0a-4f70-866e-afbf375585aa (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "primary": "test1"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "primary": "test1"}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", 
"ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 11728 1726882226.44776: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': 'active-backup', 'arp_interval': 60, 'arp_ip_target': '192.0.2.128', 'arp_validate': 'none', 'primary': 'test1'}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882225.8521323-14227-21372904954926/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882226.44783: _low_level_execute_command(): starting 11728 1726882226.44785: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882225.8521323-14227-21372904954926/ > /dev/null 2>&1 && sleep 0' 11728 1726882226.45312: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882226.45326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 11728 1726882226.45343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882226.45379: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882226.45392: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882226.45451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882226.47403: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882226.47407: stdout chunk (state=3): >>><<< 11728 1726882226.47409: stderr chunk (state=3): >>><<< 11728 1726882226.47467: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882226.47471: handler run complete 11728 1726882226.47518: attempt loop complete, returning result 11728 1726882226.47521: _execute() done 11728 1726882226.47529: dumping result to json 11728 1726882226.47536: done dumping result, returning 11728 1726882226.47545: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-5c28-a762-000000000a3e] 11728 1726882226.47550: sending task result for task 12673a56-9f93-5c28-a762-000000000a3e 11728 1726882226.47663: done sending task result for task 12673a56-9f93-5c28-a762-000000000a3e 11728 1726882226.47665: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "mode": "active-backup", "primary": "test1" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": 
"up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 2dd6ee50-995a-4f49-bd7b-b3c1e472ace8 [008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 5920b9c7-d6f0-4518-a9a4-f38b43b06206 [009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 8458f044-ac0a-4f70-866e-afbf375585aa [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 2dd6ee50-995a-4f49-bd7b-b3c1e472ace8 (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 5920b9c7-d6f0-4518-a9a4-f38b43b06206 (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 8458f044-ac0a-4f70-866e-afbf375585aa (not-active) 11728 1726882226.47805: no more pending results, returning what we have 11728 1726882226.47808: results queue empty 11728 1726882226.47809: checking for any_errors_fatal 11728 1726882226.47815: done checking for any_errors_fatal 11728 1726882226.47816: checking for max_fail_percentage 11728 1726882226.47817: done checking for max_fail_percentage 11728 1726882226.47818: checking to see if all hosts have failed and the running result is not ok 11728 1726882226.47819: done checking to see if all hosts have failed 11728 1726882226.47819: getting the remaining hosts for this loop 11728 1726882226.47821: done getting the remaining hosts for this loop 11728 1726882226.47824: getting the next task for host managed_node3 11728 1726882226.47830: done getting next task for host managed_node3 11728 1726882226.47833: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 11728 1726882226.47837: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882226.47853: getting variables 11728 1726882226.47855: in VariableManager get_vars() 11728 1726882226.47892: Calling all_inventory to load vars for managed_node3 11728 1726882226.47903: Calling groups_inventory to load vars for managed_node3 11728 1726882226.47906: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882226.47914: Calling all_plugins_play to load vars for managed_node3 11728 1726882226.47916: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882226.47919: Calling groups_plugins_play to load vars for managed_node3 11728 1726882226.48857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882226.49717: done with get_vars() 11728 1726882226.49732: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:30:26 -0400 (0:00:00.892) 0:00:51.350 ****** 11728 1726882226.49799: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 11728 1726882226.50034: worker is 1 (out of 1 available) 11728 1726882226.50048: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 11728 1726882226.50060: done queuing things up, now waiting for results queue to drain 11728 1726882226.50061: waiting for pending results... 11728 1726882226.50239: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 11728 1726882226.50343: in run() - task 12673a56-9f93-5c28-a762-000000000a3f 11728 1726882226.50354: variable 'ansible_search_path' from source: unknown 11728 1726882226.50357: variable 'ansible_search_path' from source: unknown 11728 1726882226.50386: calling self._execute() 11728 1726882226.50458: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882226.50462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882226.50471: variable 'omit' from source: magic vars 11728 1726882226.50740: variable 'ansible_distribution_major_version' from source: facts 11728 1726882226.50749: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882226.50835: variable 'network_state' from source: role '' defaults 11728 1726882226.50846: Evaluated conditional (network_state != {}): False 11728 1726882226.50849: when evaluation is False, skipping this task 11728 1726882226.50852: _execute() done 11728 1726882226.50854: dumping result to json 11728 1726882226.50857: done dumping result, returning 11728 1726882226.50859: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-5c28-a762-000000000a3f] 11728 1726882226.50866: sending task result for task 12673a56-9f93-5c28-a762-000000000a3f 11728 1726882226.50950: done sending task result for task 12673a56-9f93-5c28-a762-000000000a3f 11728 1726882226.50953: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11728 1726882226.51006: no more pending results, returning what we have 11728 1726882226.51011: results queue empty 11728 1726882226.51012: checking for any_errors_fatal 11728 1726882226.51026: done checking for any_errors_fatal 11728 1726882226.51026: checking for max_fail_percentage 11728 
1726882226.51028: done checking for max_fail_percentage 11728 1726882226.51029: checking to see if all hosts have failed and the running result is not ok 11728 1726882226.51030: done checking to see if all hosts have failed 11728 1726882226.51030: getting the remaining hosts for this loop 11728 1726882226.51032: done getting the remaining hosts for this loop 11728 1726882226.51035: getting the next task for host managed_node3 11728 1726882226.51041: done getting next task for host managed_node3 11728 1726882226.51044: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11728 1726882226.51049: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882226.51067: getting variables 11728 1726882226.51068: in VariableManager get_vars() 11728 1726882226.51107: Calling all_inventory to load vars for managed_node3 11728 1726882226.51109: Calling groups_inventory to load vars for managed_node3 11728 1726882226.51111: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882226.51119: Calling all_plugins_play to load vars for managed_node3 11728 1726882226.51122: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882226.51124: Calling groups_plugins_play to load vars for managed_node3 11728 1726882226.51868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882226.52839: done with get_vars() 11728 1726882226.52855: done getting variables 11728 1726882226.52901: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:30:26 -0400 (0:00:00.031) 0:00:51.381 ****** 11728 1726882226.52927: entering _queue_task() for managed_node3/debug 11728 1726882226.53170: worker is 1 (out of 1 available) 11728 1726882226.53182: exiting _queue_task() for managed_node3/debug 11728 1726882226.53198: done queuing things up, now waiting for results queue to drain 11728 1726882226.53200: waiting for pending results... 
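For reference, the changed result above (task path tasks/main.yml:159) was produced from a network_connections list that the role templates out of play vars (controller_profile, controller_device, port1_profile, dhcp_interface1, port2_profile, dhcp_interface2). A minimal YAML sketch reconstructed from the logged module_args, with the resolved values written as literals for illustration rather than as the test's actual variable references:

    network_connections:
      - name: bond0                 # controller_profile
        state: up
        type: bond
        interface_name: nm-bond     # controller_device
        bond:
          mode: active-backup
          arp_interval: 60
          arp_ip_target: 192.0.2.128
          arp_validate: none
          primary: test1
        ip:
          route_metric4: 65535
      - name: bond0.0               # port1_profile
        state: up
        type: ethernet
        interface_name: test1       # dhcp_interface1
        controller: bond0
      - name: bond0.1               # port2_profile
        state: up
        type: ethernet
        interface_name: test2       # dhcp_interface2
        controller: bond0

The module reported all three profiles as added and then brought up ((is-modified)/(not-active) in the stderr lines), which is why the task result is "changed": true.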
11728 1726882226.53381: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11728 1726882226.53479: in run() - task 12673a56-9f93-5c28-a762-000000000a40 11728 1726882226.53491: variable 'ansible_search_path' from source: unknown 11728 1726882226.53499: variable 'ansible_search_path' from source: unknown 11728 1726882226.53525: calling self._execute() 11728 1726882226.53602: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882226.53606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882226.53615: variable 'omit' from source: magic vars 11728 1726882226.53892: variable 'ansible_distribution_major_version' from source: facts 11728 1726882226.53905: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882226.53910: variable 'omit' from source: magic vars 11728 1726882226.53958: variable 'omit' from source: magic vars 11728 1726882226.53989: variable 'omit' from source: magic vars 11728 1726882226.54021: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882226.54050: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882226.54065: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882226.54079: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882226.54091: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882226.54118: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882226.54121: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882226.54123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882226.54188: Set connection var ansible_connection to ssh 11728 1726882226.54200: Set connection var ansible_shell_executable to /bin/sh 11728 1726882226.54203: Set connection var ansible_timeout to 10 11728 1726882226.54212: Set connection var ansible_shell_type to sh 11728 1726882226.54214: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882226.54217: Set connection var ansible_pipelining to False 11728 1726882226.54236: variable 'ansible_shell_executable' from source: unknown 11728 1726882226.54239: variable 'ansible_connection' from source: unknown 11728 1726882226.54242: variable 'ansible_module_compression' from source: unknown 11728 1726882226.54244: variable 'ansible_shell_type' from source: unknown 11728 1726882226.54246: variable 'ansible_shell_executable' from source: unknown 11728 1726882226.54249: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882226.54251: variable 'ansible_pipelining' from source: unknown 11728 1726882226.54254: variable 'ansible_timeout' from source: unknown 11728 1726882226.54259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882226.54362: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 
1726882226.54371: variable 'omit' from source: magic vars 11728 1726882226.54376: starting attempt loop 11728 1726882226.54379: running the handler 11728 1726882226.54474: variable '__network_connections_result' from source: set_fact 11728 1726882226.54529: handler run complete 11728 1726882226.54545: attempt loop complete, returning result 11728 1726882226.54548: _execute() done 11728 1726882226.54551: dumping result to json 11728 1726882226.54553: done dumping result, returning 11728 1726882226.54562: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-5c28-a762-000000000a40] 11728 1726882226.54567: sending task result for task 12673a56-9f93-5c28-a762-000000000a40 11728 1726882226.54654: done sending task result for task 12673a56-9f93-5c28-a762-000000000a40 11728 1726882226.54657: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 2dd6ee50-995a-4f49-bd7b-b3c1e472ace8", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 5920b9c7-d6f0-4518-a9a4-f38b43b06206", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 8458f044-ac0a-4f70-866e-afbf375585aa", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 2dd6ee50-995a-4f49-bd7b-b3c1e472ace8 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 5920b9c7-d6f0-4518-a9a4-f38b43b06206 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 8458f044-ac0a-4f70-866e-afbf375585aa (not-active)" ] } 11728 1726882226.54728: no more pending results, returning what we have 11728 1726882226.54733: results queue empty 11728 1726882226.54734: checking for any_errors_fatal 11728 1726882226.54740: done checking for any_errors_fatal 11728 1726882226.54741: checking for max_fail_percentage 11728 1726882226.54743: done checking for max_fail_percentage 11728 1726882226.54744: checking to see if all hosts have failed and the running result is not ok 11728 1726882226.54744: done checking to see if all hosts have failed 11728 1726882226.54745: getting the remaining hosts for this loop 11728 1726882226.54747: done getting the remaining hosts for this loop 11728 1726882226.54750: getting the next task for host managed_node3 11728 1726882226.54756: done getting next task for host managed_node3 11728 1726882226.54759: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11728 1726882226.54764: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882226.54775: getting variables 11728 1726882226.54776: in VariableManager get_vars() 11728 1726882226.54817: Calling all_inventory to load vars for managed_node3 11728 1726882226.54820: Calling groups_inventory to load vars for managed_node3 11728 1726882226.54823: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882226.54831: Calling all_plugins_play to load vars for managed_node3 11728 1726882226.54841: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882226.54844: Calling groups_plugins_play to load vars for managed_node3 11728 1726882226.59108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882226.59951: done with get_vars() 11728 1726882226.59967: done getting variables 11728 1726882226.60006: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:30:26 -0400 (0:00:00.071) 0:00:51.452 ****** 11728 1726882226.60029: entering _queue_task() for managed_node3/debug 11728 1726882226.60296: worker is 1 (out of 1 available) 11728 1726882226.60310: exiting _queue_task() for managed_node3/debug 11728 1726882226.60323: done queuing things up, now waiting for results queue to drain 11728 1726882226.60325: waiting for pending results... 
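The task above at tasks/main.yml:177 and the one that follows at :181 only print the registered __network_connections_result; nothing runs on the remote host, which is why only the local debug action (plugins/action/debug.py) is loaded for them. A sketch of what such tasks typically look like, assuming the conventional ansible.builtin.debug form rather than quoting the collection's exact source:

    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines

    - name: Show debug messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result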
11728 1726882226.60511: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11728 1726882226.60624: in run() - task 12673a56-9f93-5c28-a762-000000000a41 11728 1726882226.60635: variable 'ansible_search_path' from source: unknown 11728 1726882226.60639: variable 'ansible_search_path' from source: unknown 11728 1726882226.60671: calling self._execute() 11728 1726882226.60742: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882226.60746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882226.60755: variable 'omit' from source: magic vars 11728 1726882226.61038: variable 'ansible_distribution_major_version' from source: facts 11728 1726882226.61048: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882226.61054: variable 'omit' from source: magic vars 11728 1726882226.61106: variable 'omit' from source: magic vars 11728 1726882226.61131: variable 'omit' from source: magic vars 11728 1726882226.61163: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882226.61190: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882226.61213: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882226.61226: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882226.61237: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882226.61260: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882226.61264: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882226.61266: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882226.61336: Set connection var ansible_connection to ssh 11728 1726882226.61345: Set connection var ansible_shell_executable to /bin/sh 11728 1726882226.61349: Set connection var ansible_timeout to 10 11728 1726882226.61353: Set connection var ansible_shell_type to sh 11728 1726882226.61362: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882226.61366: Set connection var ansible_pipelining to False 11728 1726882226.61384: variable 'ansible_shell_executable' from source: unknown 11728 1726882226.61387: variable 'ansible_connection' from source: unknown 11728 1726882226.61390: variable 'ansible_module_compression' from source: unknown 11728 1726882226.61397: variable 'ansible_shell_type' from source: unknown 11728 1726882226.61400: variable 'ansible_shell_executable' from source: unknown 11728 1726882226.61402: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882226.61404: variable 'ansible_pipelining' from source: unknown 11728 1726882226.61407: variable 'ansible_timeout' from source: unknown 11728 1726882226.61409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882226.61507: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 
1726882226.61516: variable 'omit' from source: magic vars 11728 1726882226.61521: starting attempt loop 11728 1726882226.61524: running the handler 11728 1726882226.61563: variable '__network_connections_result' from source: set_fact 11728 1726882226.61619: variable '__network_connections_result' from source: set_fact 11728 1726882226.61740: handler run complete 11728 1726882226.61764: attempt loop complete, returning result 11728 1726882226.61768: _execute() done 11728 1726882226.61770: dumping result to json 11728 1726882226.61775: done dumping result, returning 11728 1726882226.61783: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-5c28-a762-000000000a41] 11728 1726882226.61789: sending task result for task 12673a56-9f93-5c28-a762-000000000a41 11728 1726882226.61880: done sending task result for task 12673a56-9f93-5c28-a762-000000000a41 11728 1726882226.61882: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "mode": "active-backup", "primary": "test1" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 2dd6ee50-995a-4f49-bd7b-b3c1e472ace8\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 5920b9c7-d6f0-4518-a9a4-f38b43b06206\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 8458f044-ac0a-4f70-866e-afbf375585aa\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 2dd6ee50-995a-4f49-bd7b-b3c1e472ace8 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 5920b9c7-d6f0-4518-a9a4-f38b43b06206 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 8458f044-ac0a-4f70-866e-afbf375585aa (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 2dd6ee50-995a-4f49-bd7b-b3c1e472ace8", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 5920b9c7-d6f0-4518-a9a4-f38b43b06206", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 8458f044-ac0a-4f70-866e-afbf375585aa", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 2dd6ee50-995a-4f49-bd7b-b3c1e472ace8 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 5920b9c7-d6f0-4518-a9a4-f38b43b06206 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 8458f044-ac0a-4f70-866e-afbf375585aa (not-active)" ] } } 11728 1726882226.61990: no more pending results, returning what we have 11728 1726882226.61998: results queue empty 11728 1726882226.61999: checking for any_errors_fatal 11728 1726882226.62006: done checking for any_errors_fatal 11728 1726882226.62006: 
checking for max_fail_percentage 11728 1726882226.62008: done checking for max_fail_percentage 11728 1726882226.62009: checking to see if all hosts have failed and the running result is not ok 11728 1726882226.62009: done checking to see if all hosts have failed 11728 1726882226.62010: getting the remaining hosts for this loop 11728 1726882226.62011: done getting the remaining hosts for this loop 11728 1726882226.62015: getting the next task for host managed_node3 11728 1726882226.62021: done getting next task for host managed_node3 11728 1726882226.62025: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11728 1726882226.62029: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882226.62039: getting variables 11728 1726882226.62040: in VariableManager get_vars() 11728 1726882226.62072: Calling all_inventory to load vars for managed_node3 11728 1726882226.62074: Calling groups_inventory to load vars for managed_node3 11728 1726882226.62076: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882226.62083: Calling all_plugins_play to load vars for managed_node3 11728 1726882226.62086: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882226.62088: Calling groups_plugins_play to load vars for managed_node3 11728 1726882226.62834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882226.63795: done with get_vars() 11728 1726882226.63818: done getting variables 11728 1726882226.63861: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:30:26 -0400 (0:00:00.038) 0:00:51.491 ****** 11728 1726882226.63886: entering _queue_task() for managed_node3/debug 11728 1726882226.64118: worker is 1 (out of 1 available) 11728 1726882226.64131: exiting _queue_task() for managed_node3/debug 11728 1726882226.64144: done queuing things up, now waiting for results queue to drain 11728 1726882226.64145: waiting for pending results... 11728 1726882226.64321: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11728 1726882226.64433: in run() - task 12673a56-9f93-5c28-a762-000000000a42 11728 1726882226.64443: variable 'ansible_search_path' from source: unknown 11728 1726882226.64447: variable 'ansible_search_path' from source: unknown 11728 1726882226.64475: calling self._execute() 11728 1726882226.64544: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882226.64549: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882226.64557: variable 'omit' from source: magic vars 11728 1726882226.64829: variable 'ansible_distribution_major_version' from source: facts 11728 1726882226.64838: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882226.64925: variable 'network_state' from source: role '' defaults 11728 1726882226.64934: Evaluated conditional (network_state != {}): False 11728 1726882226.64938: when evaluation is False, skipping this task 11728 1726882226.64941: _execute() done 11728 1726882226.64943: dumping result to json 11728 1726882226.64946: done dumping result, returning 11728 1726882226.64952: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-5c28-a762-000000000a42] 11728 1726882226.64957: sending task result for task 12673a56-9f93-5c28-a762-000000000a42 11728 1726882226.65061: done sending task result for task 12673a56-9f93-5c28-a762-000000000a42 11728 1726882226.65064: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 11728 1726882226.65113: no more pending results, returning what we 
have 11728 1726882226.65117: results queue empty 11728 1726882226.65118: checking for any_errors_fatal 11728 1726882226.65130: done checking for any_errors_fatal 11728 1726882226.65130: checking for max_fail_percentage 11728 1726882226.65132: done checking for max_fail_percentage 11728 1726882226.65133: checking to see if all hosts have failed and the running result is not ok 11728 1726882226.65134: done checking to see if all hosts have failed 11728 1726882226.65135: getting the remaining hosts for this loop 11728 1726882226.65137: done getting the remaining hosts for this loop 11728 1726882226.65140: getting the next task for host managed_node3 11728 1726882226.65147: done getting next task for host managed_node3 11728 1726882226.65152: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 11728 1726882226.65158: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882226.65181: getting variables 11728 1726882226.65183: in VariableManager get_vars() 11728 1726882226.65237: Calling all_inventory to load vars for managed_node3 11728 1726882226.65240: Calling groups_inventory to load vars for managed_node3 11728 1726882226.65242: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882226.65254: Calling all_plugins_play to load vars for managed_node3 11728 1726882226.65258: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882226.65261: Calling groups_plugins_play to load vars for managed_node3 11728 1726882226.66520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882226.67396: done with get_vars() 11728 1726882226.67411: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:30:26 -0400 (0:00:00.035) 0:00:51.527 ****** 11728 1726882226.67479: entering _queue_task() for managed_node3/ping 11728 1726882226.67707: worker is 1 (out of 1 available) 11728 1726882226.67722: exiting _queue_task() for managed_node3/ping 11728 1726882226.67734: done queuing things up, now waiting for results queue to drain 11728 1726882226.67735: waiting for pending results... 
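For orientation, the module_args echoed in the "Show debug messages for the network_connections" result above correspond to a network_connections input of roughly the following shape. This is a sketch reconstructed from the logged module_args, not a copy of the test playbook; the hosts/roles wrapper and YAML layout are illustrative assumptions.

    - hosts: managed_node3
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            network_connections:
              # bond controller in active-backup mode with ARP monitoring (values as logged above)
              - name: bond0
                type: bond
                state: up
                interface_name: nm-bond
                bond:
                  mode: active-backup
                  arp_interval: 60
                  arp_ip_target: 192.0.2.128
                  arp_validate: none
                  primary: test1
                ip:
                  route_metric4: 65535
              # two ethernet ports attached to the bond
              - name: bond0.0
                type: ethernet
                state: up
                interface_name: test1
                controller: bond0
              - name: bond0.1
                type: ethernet
                state: up
                interface_name: test2
                controller: bond0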
11728 1726882226.67923: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 11728 1726882226.68033: in run() - task 12673a56-9f93-5c28-a762-000000000a43 11728 1726882226.68044: variable 'ansible_search_path' from source: unknown 11728 1726882226.68047: variable 'ansible_search_path' from source: unknown 11728 1726882226.68078: calling self._execute() 11728 1726882226.68300: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882226.68304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882226.68307: variable 'omit' from source: magic vars 11728 1726882226.68565: variable 'ansible_distribution_major_version' from source: facts 11728 1726882226.68583: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882226.68598: variable 'omit' from source: magic vars 11728 1726882226.68667: variable 'omit' from source: magic vars 11728 1726882226.68703: variable 'omit' from source: magic vars 11728 1726882226.68749: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882226.68788: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882226.68815: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882226.68836: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882226.68850: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882226.68881: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882226.68888: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882226.68899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882226.68999: Set connection var ansible_connection to ssh 11728 1726882226.69017: Set connection var ansible_shell_executable to /bin/sh 11728 1726882226.69026: Set connection var ansible_timeout to 10 11728 1726882226.69033: Set connection var ansible_shell_type to sh 11728 1726882226.69043: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882226.69050: Set connection var ansible_pipelining to False 11728 1726882226.69073: variable 'ansible_shell_executable' from source: unknown 11728 1726882226.69079: variable 'ansible_connection' from source: unknown 11728 1726882226.69085: variable 'ansible_module_compression' from source: unknown 11728 1726882226.69090: variable 'ansible_shell_type' from source: unknown 11728 1726882226.69100: variable 'ansible_shell_executable' from source: unknown 11728 1726882226.69106: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882226.69114: variable 'ansible_pipelining' from source: unknown 11728 1726882226.69119: variable 'ansible_timeout' from source: unknown 11728 1726882226.69126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882226.69318: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11728 1726882226.69485: variable 'omit' from source: magic vars 11728 
1726882226.69488: starting attempt loop 11728 1726882226.69490: running the handler 11728 1726882226.69496: _low_level_execute_command(): starting 11728 1726882226.69499: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882226.70075: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882226.70088: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882226.70200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882226.70226: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882226.70311: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882226.71950: stdout chunk (state=3): >>>/root <<< 11728 1726882226.72088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882226.72110: stdout chunk (state=3): >>><<< 11728 1726882226.72131: stderr chunk (state=3): >>><<< 11728 1726882226.72174: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882226.72199: _low_level_execute_command(): starting 11728 1726882226.72216: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882226.7218153-14276-47830272413139 `" && echo ansible-tmp-1726882226.7218153-14276-47830272413139="` echo 
/root/.ansible/tmp/ansible-tmp-1726882226.7218153-14276-47830272413139 `" ) && sleep 0' 11728 1726882226.73144: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882226.73147: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882226.73150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882226.73152: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882226.73162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882226.73206: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882226.73233: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882226.73306: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882226.75196: stdout chunk (state=3): >>>ansible-tmp-1726882226.7218153-14276-47830272413139=/root/.ansible/tmp/ansible-tmp-1726882226.7218153-14276-47830272413139 <<< 11728 1726882226.75359: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882226.75369: stdout chunk (state=3): >>><<< 11728 1726882226.75398: stderr chunk (state=3): >>><<< 11728 1726882226.75600: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882226.7218153-14276-47830272413139=/root/.ansible/tmp/ansible-tmp-1726882226.7218153-14276-47830272413139 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882226.75603: variable 'ansible_module_compression' from source: unknown 11728 1726882226.75606: ANSIBALLZ: using cached 
module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 11728 1726882226.75608: variable 'ansible_facts' from source: unknown 11728 1726882226.75671: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882226.7218153-14276-47830272413139/AnsiballZ_ping.py 11728 1726882226.75922: Sending initial data 11728 1726882226.75925: Sent initial data (152 bytes) 11728 1726882226.76510: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882226.76539: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882226.76553: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882226.76568: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882226.76670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882226.78171: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 11728 1726882226.78177: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882226.78210: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882226.78272: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmps7yf38t1 /root/.ansible/tmp/ansible-tmp-1726882226.7218153-14276-47830272413139/AnsiballZ_ping.py <<< 11728 1726882226.78274: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882226.7218153-14276-47830272413139/AnsiballZ_ping.py" <<< 11728 1726882226.78317: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmps7yf38t1" to remote "/root/.ansible/tmp/ansible-tmp-1726882226.7218153-14276-47830272413139/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882226.7218153-14276-47830272413139/AnsiballZ_ping.py" <<< 11728 1726882226.78853: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882226.78878: stderr chunk (state=3): >>><<< 11728 1726882226.78881: stdout chunk (state=3): >>><<< 11728 1726882226.78903: done transferring module to remote 11728 1726882226.78912: _low_level_execute_command(): starting 11728 1726882226.78915: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882226.7218153-14276-47830272413139/ /root/.ansible/tmp/ansible-tmp-1726882226.7218153-14276-47830272413139/AnsiballZ_ping.py && sleep 0' 11728 1726882226.79512: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882226.79531: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882226.79541: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882226.79558: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882226.79633: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882226.81319: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882226.81345: stderr chunk (state=3): >>><<< 11728 1726882226.81352: stdout chunk (state=3): >>><<< 11728 1726882226.81364: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882226.81367: _low_level_execute_command(): starting 11728 1726882226.81372: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882226.7218153-14276-47830272413139/AnsiballZ_ping.py && sleep 0' 11728 1726882226.81785: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882226.81788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882226.81790: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882226.81792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882226.81844: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882226.81851: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882226.81939: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882226.97427: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 11728 1726882226.98742: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 11728 1726882226.98770: stderr chunk (state=3): >>><<< 11728 1726882226.98773: stdout chunk (state=3): >>><<< 11728 1726882226.98792: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 11728 1726882226.98817: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882226.7218153-14276-47830272413139/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882226.98826: _low_level_execute_command(): starting 11728 1726882226.98830: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882226.7218153-14276-47830272413139/ > /dev/null 2>&1 && sleep 0' 11728 1726882226.99284: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882226.99287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882226.99289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882226.99291: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882226.99295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882226.99338: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882226.99357: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882226.99406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882227.01219: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882227.01243: stderr chunk (state=3): >>><<< 11728 1726882227.01248: stdout chunk (state=3): >>><<< 11728 1726882227.01261: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882227.01269: handler run complete 11728 1726882227.01280: attempt loop complete, returning result 11728 1726882227.01283: _execute() done 11728 1726882227.01286: dumping result to json 11728 1726882227.01290: done dumping result, returning 11728 1726882227.01303: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-5c28-a762-000000000a43] 11728 1726882227.01308: sending task result for task 12673a56-9f93-5c28-a762-000000000a43 11728 1726882227.01395: done sending task result for task 12673a56-9f93-5c28-a762-000000000a43 11728 1726882227.01398: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 11728 1726882227.01458: no more pending results, returning what we have 11728 1726882227.01463: results queue empty 11728 1726882227.01464: checking for any_errors_fatal 11728 1726882227.01470: done checking for any_errors_fatal 11728 1726882227.01471: checking for max_fail_percentage 11728 1726882227.01473: done checking for max_fail_percentage 11728 1726882227.01474: checking to see if all hosts have failed and the running result is not ok 11728 1726882227.01474: done checking to see if all hosts have failed 11728 1726882227.01475: getting the remaining hosts for this loop 11728 1726882227.01477: done getting the remaining hosts for this loop 11728 1726882227.01479: getting the next task for host managed_node3 11728 1726882227.01490: done getting next task for host managed_node3 11728 1726882227.01492: ^ task is: TASK: meta (role_complete) 11728 1726882227.01502: ^ state is: HOST STATE: block=5, 
task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882227.01519: getting variables 11728 1726882227.01520: in VariableManager get_vars() 11728 1726882227.01565: Calling all_inventory to load vars for managed_node3 11728 1726882227.01568: Calling groups_inventory to load vars for managed_node3 11728 1726882227.01570: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882227.01578: Calling all_plugins_play to load vars for managed_node3 11728 1726882227.01581: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882227.01583: Calling groups_plugins_play to load vars for managed_node3 11728 1726882227.02522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882227.03365: done with get_vars() 11728 1726882227.03380: done getting variables 11728 1726882227.03439: done queuing things up, now waiting for results queue to drain 11728 1726882227.03440: results queue empty 11728 1726882227.03441: checking for any_errors_fatal 11728 1726882227.03443: done checking for any_errors_fatal 11728 1726882227.03443: checking for max_fail_percentage 11728 1726882227.03444: done checking for max_fail_percentage 11728 1726882227.03444: checking to see if all hosts have failed and the running result is not ok 11728 1726882227.03445: done checking to see if all hosts have failed 11728 1726882227.03445: getting the remaining hosts for this loop 11728 1726882227.03446: done getting the remaining hosts for this loop 11728 1726882227.03447: getting the next task for host managed_node3 11728 1726882227.03451: done getting next task for host managed_node3 11728 1726882227.03453: ^ task is: TASK: Show result 11728 1726882227.03455: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882227.03457: getting variables 11728 1726882227.03458: in VariableManager get_vars() 11728 1726882227.03468: Calling all_inventory to load vars for managed_node3 11728 1726882227.03469: Calling groups_inventory to load vars for managed_node3 11728 1726882227.03470: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882227.03474: Calling all_plugins_play to load vars for managed_node3 11728 1726882227.03475: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882227.03477: Calling groups_plugins_play to load vars for managed_node3 11728 1726882227.04088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882227.04959: done with get_vars() 11728 1726882227.04973: done getting variables 11728 1726882227.05006: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show result] ************************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_bond_profile_reconfigure.yml:33 Friday 20 September 2024 21:30:27 -0400 (0:00:00.375) 0:00:51.902 ****** 11728 1726882227.05028: entering _queue_task() for managed_node3/debug 11728 1726882227.05277: worker is 1 (out of 1 available) 11728 1726882227.05290: exiting _queue_task() for managed_node3/debug 11728 1726882227.05305: done queuing things up, now waiting for results queue to drain 11728 1726882227.05307: waiting for pending results... 
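The "Show result" task queued above (create_bond_profile_reconfigure.yml:33) resolves to the debug action and reads the __network_connections_result fact set earlier in the run; a minimal equivalent task, sketched here rather than copied from the playbook, is simply:

    - name: Show result
      debug:
        var: __network_connections_result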
11728 1726882227.05490: running TaskExecutor() for managed_node3/TASK: Show result 11728 1726882227.05567: in run() - task 12673a56-9f93-5c28-a762-000000000a73 11728 1726882227.05580: variable 'ansible_search_path' from source: unknown 11728 1726882227.05584: variable 'ansible_search_path' from source: unknown 11728 1726882227.05616: calling self._execute() 11728 1726882227.05692: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882227.05700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882227.05709: variable 'omit' from source: magic vars 11728 1726882227.05989: variable 'ansible_distribution_major_version' from source: facts 11728 1726882227.06003: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882227.06009: variable 'omit' from source: magic vars 11728 1726882227.06024: variable 'omit' from source: magic vars 11728 1726882227.06046: variable 'omit' from source: magic vars 11728 1726882227.06081: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882227.06113: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882227.06129: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882227.06143: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882227.06153: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882227.06210: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882227.06398: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882227.06401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882227.06403: Set connection var ansible_connection to ssh 11728 1726882227.06406: Set connection var ansible_shell_executable to /bin/sh 11728 1726882227.06408: Set connection var ansible_timeout to 10 11728 1726882227.06410: Set connection var ansible_shell_type to sh 11728 1726882227.06412: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882227.06414: Set connection var ansible_pipelining to False 11728 1726882227.06417: variable 'ansible_shell_executable' from source: unknown 11728 1726882227.06419: variable 'ansible_connection' from source: unknown 11728 1726882227.06423: variable 'ansible_module_compression' from source: unknown 11728 1726882227.06425: variable 'ansible_shell_type' from source: unknown 11728 1726882227.06427: variable 'ansible_shell_executable' from source: unknown 11728 1726882227.06428: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882227.06430: variable 'ansible_pipelining' from source: unknown 11728 1726882227.06432: variable 'ansible_timeout' from source: unknown 11728 1726882227.06434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882227.06590: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882227.06611: variable 'omit' from source: magic vars 11728 1726882227.06621: 
starting attempt loop 11728 1726882227.06627: running the handler 11728 1726882227.06684: variable '__network_connections_result' from source: set_fact 11728 1726882227.06800: variable '__network_connections_result' from source: set_fact 11728 1726882227.07008: handler run complete 11728 1726882227.07045: attempt loop complete, returning result 11728 1726882227.07052: _execute() done 11728 1726882227.07059: dumping result to json 11728 1726882227.07068: done dumping result, returning 11728 1726882227.07081: done running TaskExecutor() for managed_node3/TASK: Show result [12673a56-9f93-5c28-a762-000000000a73] 11728 1726882227.07201: sending task result for task 12673a56-9f93-5c28-a762-000000000a73 11728 1726882227.07280: done sending task result for task 12673a56-9f93-5c28-a762-000000000a73 11728 1726882227.07283: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "arp_interval": 60, "arp_ip_target": "192.0.2.128", "arp_validate": "none", "mode": "active-backup", "primary": "test1" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 2dd6ee50-995a-4f49-bd7b-b3c1e472ace8\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 5920b9c7-d6f0-4518-a9a4-f38b43b06206\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 8458f044-ac0a-4f70-866e-afbf375585aa\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 2dd6ee50-995a-4f49-bd7b-b3c1e472ace8 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 5920b9c7-d6f0-4518-a9a4-f38b43b06206 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 8458f044-ac0a-4f70-866e-afbf375585aa (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 2dd6ee50-995a-4f49-bd7b-b3c1e472ace8", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 5920b9c7-d6f0-4518-a9a4-f38b43b06206", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 8458f044-ac0a-4f70-866e-afbf375585aa", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 2dd6ee50-995a-4f49-bd7b-b3c1e472ace8 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 5920b9c7-d6f0-4518-a9a4-f38b43b06206 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 8458f044-ac0a-4f70-866e-afbf375585aa (not-active)" ] } } 11728 1726882227.07514: no more pending results, returning what we have 11728 1726882227.07518: results queue empty 11728 1726882227.07525: checking for any_errors_fatal 11728 1726882227.07527: done checking for any_errors_fatal 11728 1726882227.07528: checking for max_fail_percentage 11728 1726882227.07530: done checking for max_fail_percentage 11728 1726882227.07531: checking to see if all hosts have failed 
and the running result is not ok 11728 1726882227.07532: done checking to see if all hosts have failed 11728 1726882227.07533: getting the remaining hosts for this loop 11728 1726882227.07535: done getting the remaining hosts for this loop 11728 1726882227.07539: getting the next task for host managed_node3 11728 1726882227.07549: done getting next task for host managed_node3 11728 1726882227.07554: ^ task is: TASK: Asserts 11728 1726882227.07557: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882227.07563: getting variables 11728 1726882227.07564: in VariableManager get_vars() 11728 1726882227.07727: Calling all_inventory to load vars for managed_node3 11728 1726882227.07731: Calling groups_inventory to load vars for managed_node3 11728 1726882227.07734: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882227.07744: Calling all_plugins_play to load vars for managed_node3 11728 1726882227.07748: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882227.07751: Calling groups_plugins_play to load vars for managed_node3 11728 1726882227.09472: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882227.12147: done with get_vars() 11728 1726882227.12179: done getting variables TASK [Asserts] ***************************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:36 Friday 20 September 2024 21:30:27 -0400 (0:00:00.074) 0:00:51.977 ****** 11728 1726882227.12474: entering _queue_task() for managed_node3/include_tasks 11728 1726882227.12837: worker is 1 (out of 1 available) 11728 1726882227.12848: exiting _queue_task() for managed_node3/include_tasks 11728 1726882227.12859: done queuing things up, now waiting for results queue to drain 11728 1726882227.12860: waiting for pending results... 
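The "Asserts" task queued above is an include_tasks step driven by the lsr_assert include parameter; the records that follow show it resolving an 'item' per assert file and including tasks/assert_bond_options.yml. A sketch of that shape, with the loop syntax assumed rather than copied from run_test.yml, is:

    - name: Asserts
      include_tasks: "{{ item }}"
      loop: "{{ lsr_assert }}"   # here lsr_assert contains e.g. tasks/assert_bond_options.yml, as seen in the log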
11728 1726882227.13212: running TaskExecutor() for managed_node3/TASK: Asserts 11728 1726882227.13310: in run() - task 12673a56-9f93-5c28-a762-0000000008ef 11728 1726882227.13315: variable 'ansible_search_path' from source: unknown 11728 1726882227.13317: variable 'ansible_search_path' from source: unknown 11728 1726882227.13320: variable 'lsr_assert' from source: include params 11728 1726882227.13525: variable 'lsr_assert' from source: include params 11728 1726882227.13601: variable 'omit' from source: magic vars 11728 1726882227.13765: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882227.13780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882227.13795: variable 'omit' from source: magic vars 11728 1726882227.14051: variable 'ansible_distribution_major_version' from source: facts 11728 1726882227.14070: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882227.14085: variable 'item' from source: unknown 11728 1726882227.14178: variable 'item' from source: unknown 11728 1726882227.14200: variable 'item' from source: unknown 11728 1726882227.14262: variable 'item' from source: unknown 11728 1726882227.14705: dumping result to json 11728 1726882227.14708: done dumping result, returning 11728 1726882227.14711: done running TaskExecutor() for managed_node3/TASK: Asserts [12673a56-9f93-5c28-a762-0000000008ef] 11728 1726882227.14713: sending task result for task 12673a56-9f93-5c28-a762-0000000008ef 11728 1726882227.14758: done sending task result for task 12673a56-9f93-5c28-a762-0000000008ef 11728 1726882227.14761: WORKER PROCESS EXITING 11728 1726882227.14827: no more pending results, returning what we have 11728 1726882227.14831: in VariableManager get_vars() 11728 1726882227.14870: Calling all_inventory to load vars for managed_node3 11728 1726882227.14872: Calling groups_inventory to load vars for managed_node3 11728 1726882227.14874: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882227.14883: Calling all_plugins_play to load vars for managed_node3 11728 1726882227.14886: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882227.14888: Calling groups_plugins_play to load vars for managed_node3 11728 1726882227.16287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882227.17986: done with get_vars() 11728 1726882227.18007: variable 'ansible_search_path' from source: unknown 11728 1726882227.18008: variable 'ansible_search_path' from source: unknown 11728 1726882227.18056: we have included files to process 11728 1726882227.18057: generating all_blocks data 11728 1726882227.18060: done generating all_blocks data 11728 1726882227.18065: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 11728 1726882227.18066: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 11728 1726882227.18068: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml 11728 1726882227.18332: in VariableManager get_vars() 11728 1726882227.18363: done with get_vars() 11728 1726882227.18409: in VariableManager get_vars() 11728 1726882227.18434: done with get_vars() 11728 1726882227.18447: done processing included file 11728 1726882227.18449: 
iterating over new_blocks loaded from include file 11728 1726882227.18450: in VariableManager get_vars() 11728 1726882227.18473: done with get_vars() 11728 1726882227.18475: filtering new block on tags 11728 1726882227.18520: done filtering new block on tags 11728 1726882227.18522: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml for managed_node3 => (item=tasks/assert_bond_options.yml) 11728 1726882227.18527: extending task lists for all hosts with included blocks 11728 1726882227.21730: done extending task lists 11728 1726882227.21732: done processing included files 11728 1726882227.21733: results queue empty 11728 1726882227.21733: checking for any_errors_fatal 11728 1726882227.21739: done checking for any_errors_fatal 11728 1726882227.21740: checking for max_fail_percentage 11728 1726882227.21741: done checking for max_fail_percentage 11728 1726882227.21742: checking to see if all hosts have failed and the running result is not ok 11728 1726882227.21743: done checking to see if all hosts have failed 11728 1726882227.21744: getting the remaining hosts for this loop 11728 1726882227.21746: done getting the remaining hosts for this loop 11728 1726882227.21748: getting the next task for host managed_node3 11728 1726882227.21753: done getting next task for host managed_node3 11728 1726882227.21755: ^ task is: TASK: ** TEST check bond settings 11728 1726882227.21758: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882227.21761: getting variables 11728 1726882227.21762: in VariableManager get_vars() 11728 1726882227.21781: Calling all_inventory to load vars for managed_node3 11728 1726882227.21783: Calling groups_inventory to load vars for managed_node3 11728 1726882227.21785: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882227.21792: Calling all_plugins_play to load vars for managed_node3 11728 1726882227.21796: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882227.21799: Calling groups_plugins_play to load vars for managed_node3 11728 1726882227.22980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882227.24583: done with get_vars() 11728 1726882227.24610: done getting variables 11728 1726882227.24661: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check bond settings] ********************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:3 Friday 20 September 2024 21:30:27 -0400 (0:00:00.122) 0:00:52.099 ****** 11728 1726882227.24692: entering _queue_task() for managed_node3/command 11728 1726882227.25058: worker is 1 (out of 1 available) 11728 1726882227.25068: exiting _queue_task() for managed_node3/command 11728 1726882227.25085: done queuing things up, now waiting for results queue to drain 11728 1726882227.25086: waiting for pending results... 
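The "** TEST check bond settings" task queued above (assert_bond_options.yml:3) uses the command action, and the records that follow show it looping over bond_options_to_assert with a bond_opt loop variable and the controller_device play variable. A plausible sketch, assuming each option is read back from sysfs (the option key attribute, the register name, and the sysfs path are illustrative assumptions, not taken from this log), is:

    - name: "** TEST check bond settings"
      # read the current value of each bond option from the kernel for comparison
      command: cat /sys/class/net/{{ controller_device }}/bonding/{{ bond_opt.key }}
      register: result
      loop: "{{ bond_options_to_assert }}"
      loop_control:
        loop_var: bond_opt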
11728 1726882227.25383: running TaskExecutor() for managed_node3/TASK: ** TEST check bond settings 11728 1726882227.25522: in run() - task 12673a56-9f93-5c28-a762-000000000c2a 11728 1726882227.25527: variable 'ansible_search_path' from source: unknown 11728 1726882227.25630: variable 'ansible_search_path' from source: unknown 11728 1726882227.25634: variable 'bond_options_to_assert' from source: set_fact 11728 1726882227.25804: variable 'bond_options_to_assert' from source: set_fact 11728 1726882227.25923: variable 'omit' from source: magic vars 11728 1726882227.26078: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882227.26092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882227.26110: variable 'omit' from source: magic vars 11728 1726882227.26364: variable 'ansible_distribution_major_version' from source: facts 11728 1726882227.26379: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882227.26398: variable 'omit' from source: magic vars 11728 1726882227.26442: variable 'omit' from source: magic vars 11728 1726882227.26631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882227.29138: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882227.29219: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882227.29299: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882227.29306: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882227.29341: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882227.29445: variable 'controller_device' from source: play vars 11728 1726882227.29464: variable 'bond_opt' from source: unknown 11728 1726882227.29699: variable 'omit' from source: magic vars 11728 1726882227.29703: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882227.29706: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882227.29709: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882227.29711: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882227.29713: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882227.29715: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882227.29718: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882227.29720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882227.29756: Set connection var ansible_connection to ssh 11728 1726882227.29770: Set connection var ansible_shell_executable to /bin/sh 11728 1726882227.29780: Set connection var ansible_timeout to 10 11728 1726882227.29786: Set connection var ansible_shell_type to sh 11728 1726882227.29800: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882227.29809: Set connection var ansible_pipelining to False 11728 
1726882227.29844: variable 'ansible_shell_executable' from source: unknown 11728 1726882227.29852: variable 'ansible_connection' from source: unknown 11728 1726882227.29858: variable 'ansible_module_compression' from source: unknown 11728 1726882227.29864: variable 'ansible_shell_type' from source: unknown 11728 1726882227.29870: variable 'ansible_shell_executable' from source: unknown 11728 1726882227.29876: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882227.29883: variable 'ansible_pipelining' from source: unknown 11728 1726882227.29889: variable 'ansible_timeout' from source: unknown 11728 1726882227.29899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882227.30008: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882227.30025: variable 'omit' from source: magic vars 11728 1726882227.30034: starting attempt loop 11728 1726882227.30041: running the handler 11728 1726882227.30067: _low_level_execute_command(): starting 11728 1726882227.30077: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882227.30789: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882227.30829: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882227.30935: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882227.30985: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882227.31047: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882227.32999: stdout chunk (state=3): >>>/root <<< 11728 1726882227.33166: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882227.33169: stdout chunk (state=3): >>><<< 11728 1726882227.33172: stderr chunk (state=3): >>><<< 11728 1726882227.33198: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882227.33315: _low_level_execute_command(): starting 11728 1726882227.33320: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882227.3321354-14304-178285474736130 `" && echo ansible-tmp-1726882227.3321354-14304-178285474736130="` echo /root/.ansible/tmp/ansible-tmp-1726882227.3321354-14304-178285474736130 `" ) && sleep 0' 11728 1726882227.33879: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882227.33892: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882227.33908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882227.33926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882227.33949: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882227.34051: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882227.34069: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882227.34154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882227.36064: stdout chunk (state=3): >>>ansible-tmp-1726882227.3321354-14304-178285474736130=/root/.ansible/tmp/ansible-tmp-1726882227.3321354-14304-178285474736130 <<< 11728 1726882227.36399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882227.36403: stdout chunk (state=3): >>><<< 11728 1726882227.36406: stderr chunk (state=3): >>><<< 11728 1726882227.36409: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882227.3321354-14304-178285474736130=/root/.ansible/tmp/ansible-tmp-1726882227.3321354-14304-178285474736130 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882227.36411: variable 'ansible_module_compression' from source: unknown 11728 1726882227.36414: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882227.36416: variable 'ansible_facts' from source: unknown 11728 1726882227.36521: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882227.3321354-14304-178285474736130/AnsiballZ_command.py 11728 1726882227.36667: Sending initial data 11728 1726882227.36671: Sent initial data (156 bytes) 11728 1726882227.37186: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882227.37192: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882227.37213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882227.37228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882227.37309: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882227.37328: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882227.37339: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882227.37348: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882227.37429: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882227.39201: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11728 1726882227.39221: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882227.39367: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882227.39431: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmp8xh2b_ro /root/.ansible/tmp/ansible-tmp-1726882227.3321354-14304-178285474736130/AnsiballZ_command.py <<< 11728 1726882227.39435: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882227.3321354-14304-178285474736130/AnsiballZ_command.py" <<< 11728 1726882227.39483: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmp8xh2b_ro" to remote "/root/.ansible/tmp/ansible-tmp-1726882227.3321354-14304-178285474736130/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882227.3321354-14304-178285474736130/AnsiballZ_command.py" <<< 11728 1726882227.40268: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882227.40385: stderr chunk (state=3): >>><<< 11728 1726882227.40388: stdout chunk (state=3): >>><<< 11728 1726882227.40417: done transferring module to remote 11728 1726882227.40434: _low_level_execute_command(): starting 11728 1726882227.40481: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882227.3321354-14304-178285474736130/ /root/.ansible/tmp/ansible-tmp-1726882227.3321354-14304-178285474736130/AnsiballZ_command.py && sleep 0' 11728 1726882227.41531: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882227.41600: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882227.41629: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882227.41669: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882227.41714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882227.43485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882227.43488: stdout chunk 
(state=3): >>><<< 11728 1726882227.43490: stderr chunk (state=3): >>><<< 11728 1726882227.43592: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882227.43601: _low_level_execute_command(): starting 11728 1726882227.43605: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882227.3321354-14304-178285474736130/AnsiballZ_command.py && sleep 0' 11728 1726882227.44272: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882227.44287: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882227.44309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882227.44328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882227.44358: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882227.44460: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882227.44490: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882227.44591: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882227.60101: stdout chunk (state=3): >>> {"changed": true, "stdout": "active-backup 1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/mode"], "start": "2024-09-20 21:30:27.595906", "end": "2024-09-20 21:30:27.599174", "delta": "0:00:00.003268", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/mode", "_uses_shell": 
false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882227.61707: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882227.61711: stdout chunk (state=3): >>><<< 11728 1726882227.61714: stderr chunk (state=3): >>><<< 11728 1726882227.61717: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "active-backup 1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/mode"], "start": "2024-09-20 21:30:27.595906", "end": "2024-09-20 21:30:27.599174", "delta": "0:00:00.003268", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/mode", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
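At this point the executor has created a remote temp directory, copied AnsiballZ_command.py to it over SFTP, made it executable, and run it with /usr/bin/python3.12; the JSON above is the raw module result for the first loop item (mode). The conditional evaluated a few records below, bond_opt.value in result.stdout, amounts to the standalone check sketched here. The task and the literal values are illustrative, lifted from the stdout shown above, and are not part of the original playbook.

    # Illustrative only: an equivalent standalone check for the first loop item
    - name: Equivalent check for the 'mode' item (not in the original playbook)
      vars:
        bond_opt: {key: mode, value: active-backup}
        result: {stdout: "active-backup 1"}
      ansible.builtin.assert:
        that:
          - bond_opt.value in result.stdout   # "active-backup" is a substring of "active-backup 1"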
11728 1726882227.61719: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/mode', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882227.3321354-14304-178285474736130/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882227.61727: _low_level_execute_command(): starting 11728 1726882227.61729: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882227.3321354-14304-178285474736130/ > /dev/null 2>&1 && sleep 0' 11728 1726882227.62199: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882227.62203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882227.62231: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882227.62234: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882227.62237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882227.62282: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882227.62286: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882227.62341: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882227.64344: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882227.64360: stderr chunk (state=3): >>><<< 11728 1726882227.64374: stdout chunk (state=3): >>><<< 11728 1726882227.64385: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882227.64391: handler run complete 11728 1726882227.64412: Evaluated conditional (False): False 11728 1726882227.64531: variable 'bond_opt' from source: unknown 11728 1726882227.64536: variable 'result' from source: set_fact 11728 1726882227.64548: Evaluated conditional (bond_opt.value in result.stdout): True 11728 1726882227.64557: attempt loop complete, returning result 11728 1726882227.64571: variable 'bond_opt' from source: unknown 11728 1726882227.64625: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'mode', 'value': 'active-backup'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "mode", "value": "active-backup" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/mode" ], "delta": "0:00:00.003268", "end": "2024-09-20 21:30:27.599174", "rc": 0, "start": "2024-09-20 21:30:27.595906" } STDOUT: active-backup 1 11728 1726882227.64831: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882227.64834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882227.64836: variable 'omit' from source: magic vars 11728 1726882227.64874: variable 'ansible_distribution_major_version' from source: facts 11728 1726882227.64878: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882227.64882: variable 'omit' from source: magic vars 11728 1726882227.64897: variable 'omit' from source: magic vars 11728 1726882227.65006: variable 'controller_device' from source: play vars 11728 1726882227.65009: variable 'bond_opt' from source: unknown 11728 1726882227.65030: variable 'omit' from source: magic vars 11728 1726882227.65042: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882227.65049: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882227.65055: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882227.65066: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882227.65069: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882227.65071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882227.65120: Set connection var ansible_connection to ssh 11728 1726882227.65128: Set connection var ansible_shell_executable to /bin/sh 11728 1726882227.65135: Set connection var ansible_timeout to 10 11728 1726882227.65138: Set connection var ansible_shell_type to sh 11728 1726882227.65143: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882227.65147: Set connection var ansible_pipelining to False 11728 1726882227.65163: variable 'ansible_shell_executable' from source: unknown 11728 
1726882227.65166: variable 'ansible_connection' from source: unknown 11728 1726882227.65169: variable 'ansible_module_compression' from source: unknown 11728 1726882227.65171: variable 'ansible_shell_type' from source: unknown 11728 1726882227.65173: variable 'ansible_shell_executable' from source: unknown 11728 1726882227.65175: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882227.65180: variable 'ansible_pipelining' from source: unknown 11728 1726882227.65182: variable 'ansible_timeout' from source: unknown 11728 1726882227.65186: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882227.65253: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882227.65259: variable 'omit' from source: magic vars 11728 1726882227.65262: starting attempt loop 11728 1726882227.65264: running the handler 11728 1726882227.65272: _low_level_execute_command(): starting 11728 1726882227.65275: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882227.65668: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882227.65706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882227.65709: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882227.65716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 11728 1726882227.65718: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882227.65720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882227.65756: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882227.65760: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882227.65826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882227.67378: stdout chunk (state=3): >>>/root <<< 11728 1726882227.67478: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882227.67501: stderr chunk (state=3): >>><<< 11728 1726882227.67504: stdout chunk (state=3): >>><<< 11728 1726882227.67513: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 
originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882227.67521: _low_level_execute_command(): starting 11728 1726882227.67526: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882227.6751344-14304-255686919835586 `" && echo ansible-tmp-1726882227.6751344-14304-255686919835586="` echo /root/.ansible/tmp/ansible-tmp-1726882227.6751344-14304-255686919835586 `" ) && sleep 0' 11728 1726882227.67931: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882227.67935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882227.67937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882227.67939: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882227.67941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882227.67976: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882227.67987: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882227.68045: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882227.69929: stdout chunk (state=3): >>>ansible-tmp-1726882227.6751344-14304-255686919835586=/root/.ansible/tmp/ansible-tmp-1726882227.6751344-14304-255686919835586 <<< 11728 1726882227.69992: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882227.70037: stderr chunk (state=3): >>><<< 11728 1726882227.70040: stdout chunk (state=3): >>><<< 11728 1726882227.70073: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882227.6751344-14304-255686919835586=/root/.ansible/tmp/ansible-tmp-1726882227.6751344-14304-255686919835586 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882227.70096: variable 'ansible_module_compression' from source: unknown 11728 1726882227.70128: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882227.70143: variable 'ansible_facts' from source: unknown 11728 1726882227.70200: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882227.6751344-14304-255686919835586/AnsiballZ_command.py 11728 1726882227.70291: Sending initial data 11728 1726882227.70299: Sent initial data (156 bytes) 11728 1726882227.70801: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882227.70845: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882227.70885: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882227.72432: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882227.72474: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882227.72522: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmp5jb2rmnv /root/.ansible/tmp/ansible-tmp-1726882227.6751344-14304-255686919835586/AnsiballZ_command.py <<< 11728 1726882227.72525: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882227.6751344-14304-255686919835586/AnsiballZ_command.py" <<< 11728 1726882227.72565: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmp5jb2rmnv" to remote "/root/.ansible/tmp/ansible-tmp-1726882227.6751344-14304-255686919835586/AnsiballZ_command.py" <<< 11728 1726882227.72571: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882227.6751344-14304-255686919835586/AnsiballZ_command.py" <<< 11728 1726882227.73126: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882227.73135: stderr chunk (state=3): >>><<< 11728 1726882227.73138: stdout chunk (state=3): >>><<< 11728 1726882227.73183: done transferring module to remote 11728 1726882227.73191: _low_level_execute_command(): starting 11728 1726882227.73200: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882227.6751344-14304-255686919835586/ /root/.ansible/tmp/ansible-tmp-1726882227.6751344-14304-255686919835586/AnsiballZ_command.py && sleep 0' 11728 1726882227.73710: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882227.73764: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882227.75564: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882227.75585: stderr chunk (state=3): >>><<< 11728 1726882227.75588: stdout chunk (state=3): >>><<< 11728 1726882227.75608: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882227.75687: _low_level_execute_command(): starting 11728 1726882227.75690: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882227.6751344-14304-255686919835586/AnsiballZ_command.py && sleep 0' 11728 1726882227.76185: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882227.76188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882227.76191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882227.76197: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882227.76200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882227.76286: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882227.76332: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882227.76382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882227.92469: stdout chunk (state=3): >>> {"changed": true, "stdout": "60", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_interval"], "start": "2024-09-20 21:30:27.917367", "end": "2024-09-20 21:30:27.920426", "delta": "0:00:00.003059", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_interval", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882227.93778: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 11728 1726882227.93782: stderr chunk (state=3): >>><<< 11728 1726882227.93800: stdout chunk (state=3): >>><<< 11728 1726882227.94069: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "60", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_interval"], "start": "2024-09-20 21:30:27.917367", "end": "2024-09-20 21:30:27.920426", "delta": "0:00:00.003059", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_interval", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
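The second loop item (arp_interval, expected value 60) has now been read back the same way. The log only shows that bond_options_to_assert originates from set_fact and reveals two of its entries, so the following definition is a hypothetical illustration of the variable's shape; the real list is built elsewhere in the test and likely contains more options.

    # Hypothetical example of the shape of bond_options_to_assert (illustrative only)
    - name: Example definition of the options list
      ansible.builtin.set_fact:
        bond_options_to_assert:
          - {key: mode, value: active-backup}
          - {key: arp_interval, value: "60"}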
11728 1726882227.94078: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/arp_interval', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882227.6751344-14304-255686919835586/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882227.94081: _low_level_execute_command(): starting 11728 1726882227.94083: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882227.6751344-14304-255686919835586/ > /dev/null 2>&1 && sleep 0' 11728 1726882227.95201: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882227.95311: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882227.95384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882227.95513: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882227.95529: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882227.95607: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882227.97425: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882227.97462: stdout chunk (state=3): >>><<< 11728 1726882227.97498: stderr chunk (state=3): >>><<< 11728 1726882227.97576: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882227.97587: handler run complete 11728 1726882227.97701: Evaluated conditional (False): False 11728 1726882227.98039: variable 'bond_opt' from source: unknown 11728 1726882227.98221: variable 'result' from source: set_fact 11728 1726882227.98225: Evaluated conditional (bond_opt.value in result.stdout): True 11728 1726882227.98227: attempt loop complete, returning result 11728 1726882227.98229: variable 'bond_opt' from source: unknown 11728 1726882227.98423: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'arp_interval', 'value': '60'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "arp_interval", "value": "60" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/arp_interval" ], "delta": "0:00:00.003059", "end": "2024-09-20 21:30:27.920426", "rc": 0, "start": "2024-09-20 21:30:27.917367" } STDOUT: 60 11728 1726882227.98931: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882227.98934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882227.98936: variable 'omit' from source: magic vars 11728 1726882227.98939: variable 'ansible_distribution_major_version' from source: facts 11728 1726882227.98941: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882227.98943: variable 'omit' from source: magic vars 11728 1726882227.98945: variable 'omit' from source: magic vars 11728 1726882227.99100: variable 'controller_device' from source: play vars 11728 1726882227.99103: variable 'bond_opt' from source: unknown 11728 1726882227.99105: variable 'omit' from source: magic vars 11728 1726882227.99108: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882227.99110: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882227.99113: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882227.99148: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882227.99151: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882227.99153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882227.99366: Set connection var ansible_connection to ssh 11728 1726882227.99369: Set connection var ansible_shell_executable to /bin/sh 11728 1726882227.99371: Set connection var ansible_timeout to 10 11728 1726882227.99374: Set connection var ansible_shell_type to sh 11728 1726882227.99376: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882227.99378: Set connection var ansible_pipelining to False 11728 1726882227.99380: variable 'ansible_shell_executable' from source: unknown 11728 1726882227.99381: variable 'ansible_connection' from source: unknown 11728 1726882227.99383: 
variable 'ansible_module_compression' from source: unknown 11728 1726882227.99385: variable 'ansible_shell_type' from source: unknown 11728 1726882227.99387: variable 'ansible_shell_executable' from source: unknown 11728 1726882227.99389: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882227.99391: variable 'ansible_pipelining' from source: unknown 11728 1726882227.99397: variable 'ansible_timeout' from source: unknown 11728 1726882227.99400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882227.99402: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882227.99404: variable 'omit' from source: magic vars 11728 1726882227.99406: starting attempt loop 11728 1726882227.99413: running the handler 11728 1726882227.99415: _low_level_execute_command(): starting 11728 1726882227.99417: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882227.99951: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882227.99960: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882227.99971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882227.99985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882228.00000: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882228.00009: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882228.00018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882228.00233: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882228.00236: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882228.00239: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11728 1726882228.00240: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882228.00242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882228.00244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882228.00246: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882228.00247: stderr chunk (state=3): >>>debug2: match found <<< 11728 1726882228.00249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882228.00251: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882228.00253: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882228.00255: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882228.00424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882228.01981: stdout chunk (state=3): >>>/root <<< 11728 1726882228.02108: stderr chunk (state=3): >>>debug2: Received 
exit status from master 0 <<< 11728 1726882228.02124: stderr chunk (state=3): >>><<< 11728 1726882228.02127: stdout chunk (state=3): >>><<< 11728 1726882228.02146: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882228.02155: _low_level_execute_command(): starting 11728 1726882228.02160: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882228.021454-14304-195347261973753 `" && echo ansible-tmp-1726882228.021454-14304-195347261973753="` echo /root/.ansible/tmp/ansible-tmp-1726882228.021454-14304-195347261973753 `" ) && sleep 0' 11728 1726882228.03357: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882228.03360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882228.03527: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882228.03531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 11728 1726882228.03533: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882228.03536: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882228.03538: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882228.03658: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882228.03901: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882228.03909: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882228.05870: stdout chunk 
(state=3): >>>ansible-tmp-1726882228.021454-14304-195347261973753=/root/.ansible/tmp/ansible-tmp-1726882228.021454-14304-195347261973753 <<< 11728 1726882228.06006: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882228.06009: stderr chunk (state=3): >>><<< 11728 1726882228.06014: stdout chunk (state=3): >>><<< 11728 1726882228.06035: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882228.021454-14304-195347261973753=/root/.ansible/tmp/ansible-tmp-1726882228.021454-14304-195347261973753 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882228.06058: variable 'ansible_module_compression' from source: unknown 11728 1726882228.06098: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882228.06122: variable 'ansible_facts' from source: unknown 11728 1726882228.06191: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882228.021454-14304-195347261973753/AnsiballZ_command.py 11728 1726882228.06805: Sending initial data 11728 1726882228.06808: Sent initial data (155 bytes) 11728 1726882228.07341: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882228.07349: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882228.07361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882228.07375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882228.07388: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882228.07476: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882228.07492: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882228.07565: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882228.09179: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882228.09206: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882228.09246: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmp29dyz6xk /root/.ansible/tmp/ansible-tmp-1726882228.021454-14304-195347261973753/AnsiballZ_command.py <<< 11728 1726882228.09255: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882228.021454-14304-195347261973753/AnsiballZ_command.py" <<< 11728 1726882228.09376: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmp29dyz6xk" to remote "/root/.ansible/tmp/ansible-tmp-1726882228.021454-14304-195347261973753/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882228.021454-14304-195347261973753/AnsiballZ_command.py" <<< 11728 1726882228.10553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882228.10618: stderr chunk (state=3): >>><<< 11728 1726882228.10622: stdout chunk (state=3): >>><<< 11728 1726882228.10652: done transferring module to remote 11728 1726882228.10659: _low_level_execute_command(): starting 11728 1726882228.10669: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882228.021454-14304-195347261973753/ /root/.ansible/tmp/ansible-tmp-1726882228.021454-14304-195347261973753/AnsiballZ_command.py && sleep 0' 11728 1726882228.11200: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882228.11215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882228.11230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882228.11288: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882228.11292: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882228.11369: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882228.13443: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882228.13446: stdout chunk (state=3): >>><<< 11728 1726882228.13449: stderr chunk (state=3): >>><<< 11728 1726882228.13643: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882228.13646: _low_level_execute_command(): starting 11728 1726882228.13648: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882228.021454-14304-195347261973753/AnsiballZ_command.py && sleep 0' 11728 1726882228.14714: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882228.14728: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882228.14740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882228.14776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882228.14799: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882228.14899: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882228.14925: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882228.14942: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882228.14957: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882228.15113: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882228.30685: stdout chunk (state=3): >>> {"changed": true, "stdout": "192.0.2.128", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_ip_target"], "start": "2024-09-20 21:30:28.302267", "end": "2024-09-20 21:30:28.305241", "delta": "0:00:00.002974", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_ip_target", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882228.32140: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882228.32170: stderr chunk (state=3): >>><<< 11728 1726882228.32173: stdout chunk (state=3): >>><<< 11728 1726882228.32190: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "192.0.2.128", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_ip_target"], "start": "2024-09-20 21:30:28.302267", "end": "2024-09-20 21:30:28.305241", "delta": "0:00:00.002974", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_ip_target", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
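
The loop items in this stretch of the run all follow the same pattern: the ansible.legacy.command module runs cat against one attribute under /sys/class/net/nm-bond/bonding/, the output is registered, and the conditional "bond_opt.value in result.stdout" is evaluated inside an attempt loop. A minimal sketch of a task shaped like that is below; it is reconstructed from this log rather than taken from the actual test playbook, and the list name bond_options_to_assert is hypothetical (controller_device, bond_opt, and result are the names the log itself reports):

# Hedged sketch reconstructed from the log above; not the actual test fixture.
# 'bond_options_to_assert' is a hypothetical variable name.
- name: Verify each bond option on the controller device
  command: cat /sys/class/net/{{ controller_device }}/bonding/{{ bond_opt.key }}
  register: result
  # "Evaluated conditional (bond_opt.value in result.stdout)" inside an
  # attempt loop is what an 'until' retry produces (attempts: 1 in these results).
  until: bond_opt.value in result.stdout
  changed_when: false   # matches the "changed": false shown in the per-item results
  loop: "{{ bond_options_to_assert }}"
  loop_control:
    loop_var: bond_opt
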
11728 1726882228.32214: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/arp_ip_target', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882228.021454-14304-195347261973753/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882228.32219: _low_level_execute_command(): starting 11728 1726882228.32223: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882228.021454-14304-195347261973753/ > /dev/null 2>&1 && sleep 0' 11728 1726882228.32665: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882228.32704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882228.32707: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882228.32709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882228.32711: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882228.32714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882228.32761: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882228.32764: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882228.32770: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882228.32821: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882228.34606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882228.34631: stderr chunk (state=3): >>><<< 11728 1726882228.34634: stdout chunk (state=3): >>><<< 11728 1726882228.34651: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882228.34654: handler run complete 11728 1726882228.34670: Evaluated conditional (False): False 11728 1726882228.34779: variable 'bond_opt' from source: unknown 11728 1726882228.34784: variable 'result' from source: set_fact 11728 1726882228.34797: Evaluated conditional (bond_opt.value in result.stdout): True 11728 1726882228.34808: attempt loop complete, returning result 11728 1726882228.34824: variable 'bond_opt' from source: unknown 11728 1726882228.34873: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'arp_ip_target', 'value': '192.0.2.128'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "arp_ip_target", "value": "192.0.2.128" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/arp_ip_target" ], "delta": "0:00:00.002974", "end": "2024-09-20 21:30:28.305241", "rc": 0, "start": "2024-09-20 21:30:28.302267" } STDOUT: 192.0.2.128 11728 1726882228.35006: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882228.35009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882228.35011: variable 'omit' from source: magic vars 11728 1726882228.35089: variable 'ansible_distribution_major_version' from source: facts 11728 1726882228.35092: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882228.35101: variable 'omit' from source: magic vars 11728 1726882228.35113: variable 'omit' from source: magic vars 11728 1726882228.35222: variable 'controller_device' from source: play vars 11728 1726882228.35225: variable 'bond_opt' from source: unknown 11728 1726882228.35242: variable 'omit' from source: magic vars 11728 1726882228.35258: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882228.35264: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882228.35270: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882228.35280: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882228.35283: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882228.35285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882228.35335: Set connection var ansible_connection to ssh 11728 1726882228.35347: Set connection var ansible_shell_executable to /bin/sh 11728 1726882228.35350: Set connection var ansible_timeout to 10 11728 1726882228.35352: Set connection var ansible_shell_type to sh 11728 1726882228.35355: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882228.35360: Set connection var 
ansible_pipelining to False 11728 1726882228.35375: variable 'ansible_shell_executable' from source: unknown 11728 1726882228.35378: variable 'ansible_connection' from source: unknown 11728 1726882228.35380: variable 'ansible_module_compression' from source: unknown 11728 1726882228.35383: variable 'ansible_shell_type' from source: unknown 11728 1726882228.35385: variable 'ansible_shell_executable' from source: unknown 11728 1726882228.35387: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882228.35391: variable 'ansible_pipelining' from source: unknown 11728 1726882228.35395: variable 'ansible_timeout' from source: unknown 11728 1726882228.35401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882228.35464: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882228.35470: variable 'omit' from source: magic vars 11728 1726882228.35473: starting attempt loop 11728 1726882228.35475: running the handler 11728 1726882228.35482: _low_level_execute_command(): starting 11728 1726882228.35485: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882228.35924: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882228.35928: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 11728 1726882228.35930: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 11728 1726882228.35932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882228.35980: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882228.35983: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882228.35986: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882228.36037: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882228.37629: stdout chunk (state=3): >>>/root <<< 11728 1726882228.37727: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882228.37754: stderr chunk (state=3): >>><<< 11728 1726882228.37757: stdout chunk (state=3): >>><<< 11728 1726882228.37769: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882228.37776: _low_level_execute_command(): starting 11728 1726882228.37780: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882228.3776824-14304-150769718635438 `" && echo ansible-tmp-1726882228.3776824-14304-150769718635438="` echo /root/.ansible/tmp/ansible-tmp-1726882228.3776824-14304-150769718635438 `" ) && sleep 0' 11728 1726882228.38180: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882228.38214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882228.38217: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882228.38219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882228.38221: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882228.38223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882228.38274: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882228.38280: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882228.38283: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882228.38327: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882228.40178: stdout chunk (state=3): >>>ansible-tmp-1726882228.3776824-14304-150769718635438=/root/.ansible/tmp/ansible-tmp-1726882228.3776824-14304-150769718635438 <<< 11728 1726882228.40283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882228.40310: stderr chunk (state=3): >>><<< 11728 1726882228.40313: stdout chunk (state=3): >>><<< 11728 1726882228.40327: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882228.3776824-14304-150769718635438=/root/.ansible/tmp/ansible-tmp-1726882228.3776824-14304-150769718635438 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882228.40344: variable 'ansible_module_compression' from source: unknown 11728 1726882228.40374: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882228.40388: variable 'ansible_facts' from source: unknown 11728 1726882228.40436: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882228.3776824-14304-150769718635438/AnsiballZ_command.py 11728 1726882228.40524: Sending initial data 11728 1726882228.40528: Sent initial data (156 bytes) 11728 1726882228.40953: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882228.40956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882228.40959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882228.40961: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882228.40964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882228.41006: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882228.41023: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882228.41067: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882228.42607: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882228.42661: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882228.42719: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpjdzsph72 /root/.ansible/tmp/ansible-tmp-1726882228.3776824-14304-150769718635438/AnsiballZ_command.py <<< 11728 1726882228.42722: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882228.3776824-14304-150769718635438/AnsiballZ_command.py" <<< 11728 1726882228.42760: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpjdzsph72" to remote "/root/.ansible/tmp/ansible-tmp-1726882228.3776824-14304-150769718635438/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882228.3776824-14304-150769718635438/AnsiballZ_command.py" <<< 11728 1726882228.43668: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882228.43671: stdout chunk (state=3): >>><<< 11728 1726882228.43673: stderr chunk (state=3): >>><<< 11728 1726882228.43675: done transferring module to remote 11728 1726882228.43677: _low_level_execute_command(): starting 11728 1726882228.43679: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882228.3776824-14304-150769718635438/ /root/.ansible/tmp/ansible-tmp-1726882228.3776824-14304-150769718635438/AnsiballZ_command.py && sleep 0' 11728 1726882228.44310: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882228.44338: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882228.44355: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882228.44371: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882228.44438: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882228.46210: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882228.46224: stdout chunk (state=3): >>><<< 11728 1726882228.46309: stderr chunk (state=3): >>><<< 11728 1726882228.46313: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882228.46316: _low_level_execute_command(): starting 11728 1726882228.46319: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882228.3776824-14304-150769718635438/AnsiballZ_command.py && sleep 0' 11728 1726882228.46878: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882228.46891: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882228.46911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882228.46930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882228.46946: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882228.46982: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882228.47053: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882228.47072: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882228.47118: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882228.47168: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882228.62881: stdout chunk (state=3): >>> {"changed": true, 
"stdout": "none 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_validate"], "start": "2024-09-20 21:30:28.624079", "end": "2024-09-20 21:30:28.627153", "delta": "0:00:00.003074", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_validate", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882228.64616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882228.64620: stdout chunk (state=3): >>><<< 11728 1726882228.64622: stderr chunk (state=3): >>><<< 11728 1726882228.64624: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "none 0", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/arp_validate"], "start": "2024-09-20 21:30:28.624079", "end": "2024-09-20 21:30:28.627153", "delta": "0:00:00.003074", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/arp_validate", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
11728 1726882228.64633: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/arp_validate', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882228.3776824-14304-150769718635438/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882228.64811: _low_level_execute_command(): starting 11728 1726882228.64815: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882228.3776824-14304-150769718635438/ > /dev/null 2>&1 && sleep 0' 11728 1726882228.65942: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882228.66102: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882228.66320: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882228.66401: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882228.68274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882228.68292: stderr chunk (state=3): >>><<< 11728 1726882228.68585: stdout chunk (state=3): >>><<< 11728 1726882228.68589: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882228.68591: handler run complete 11728 1726882228.68597: Evaluated conditional (False): False 11728 1726882228.68819: variable 'bond_opt' from source: unknown 11728 1726882228.68928: variable 'result' from source: set_fact 11728 1726882228.68931: Evaluated conditional (bond_opt.value in result.stdout): True 11728 1726882228.68933: attempt loop complete, returning result 11728 1726882228.68936: variable 'bond_opt' from source: unknown 11728 1726882228.69010: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'arp_validate', 'value': 'none'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "arp_validate", "value": "none" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/arp_validate" ], "delta": "0:00:00.003074", "end": "2024-09-20 21:30:28.627153", "rc": 0, "start": "2024-09-20 21:30:28.624079" } STDOUT: none 0 11728 1726882228.69398: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882228.69487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882228.69491: variable 'omit' from source: magic vars 11728 1726882228.69612: variable 'ansible_distribution_major_version' from source: facts 11728 1726882228.69624: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882228.69632: variable 'omit' from source: magic vars 11728 1726882228.69649: variable 'omit' from source: magic vars 11728 1726882228.69824: variable 'controller_device' from source: play vars 11728 1726882228.69833: variable 'bond_opt' from source: unknown 11728 1726882228.69855: variable 'omit' from source: magic vars 11728 1726882228.69878: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882228.69890: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882228.69905: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882228.69929: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882228.69937: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882228.69944: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882228.70018: Set connection var ansible_connection to ssh 11728 1726882228.70038: Set connection var ansible_shell_executable to /bin/sh 11728 1726882228.70047: Set connection var ansible_timeout to 10 11728 1726882228.70053: Set connection var ansible_shell_type to sh 11728 1726882228.70100: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882228.70103: Set connection var ansible_pipelining to False 11728 1726882228.70105: variable 'ansible_shell_executable' from source: unknown 11728 1726882228.70107: variable 'ansible_connection' from source: unknown 11728 1726882228.70109: variable 'ansible_module_compression' from source: unknown 11728 1726882228.70111: variable 'ansible_shell_type' from source: unknown 
11728 1726882228.70115: variable 'ansible_shell_executable' from source: unknown 11728 1726882228.70122: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882228.70129: variable 'ansible_pipelining' from source: unknown 11728 1726882228.70143: variable 'ansible_timeout' from source: unknown 11728 1726882228.70150: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882228.70248: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882228.70346: variable 'omit' from source: magic vars 11728 1726882228.70357: starting attempt loop 11728 1726882228.70360: running the handler 11728 1726882228.70362: _low_level_execute_command(): starting 11728 1726882228.70364: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882228.71057: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882228.71075: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882228.71092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882228.71126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882228.71138: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882228.71210: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882228.71230: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882228.71250: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882228.71261: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882228.71362: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882228.73552: stdout chunk (state=3): >>>/root <<< 11728 1726882228.73556: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882228.73558: stdout chunk (state=3): >>><<< 11728 1726882228.73560: stderr chunk (state=3): >>><<< 11728 1726882228.73563: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882228.73570: _low_level_execute_command(): starting 11728 1726882228.73572: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882228.732852-14304-36337781574062 `" && echo ansible-tmp-1726882228.732852-14304-36337781574062="` echo /root/.ansible/tmp/ansible-tmp-1726882228.732852-14304-36337781574062 `" ) && sleep 0' 11728 1726882228.74683: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882228.74910: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882228.74977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882228.76838: stdout chunk (state=3): >>>ansible-tmp-1726882228.732852-14304-36337781574062=/root/.ansible/tmp/ansible-tmp-1726882228.732852-14304-36337781574062 <<< 11728 1726882228.76933: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882228.77000: stderr chunk (state=3): >>><<< 11728 1726882228.77004: stdout chunk (state=3): >>><<< 11728 1726882228.77006: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882228.732852-14304-36337781574062=/root/.ansible/tmp/ansible-tmp-1726882228.732852-14304-36337781574062 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882228.77020: variable 'ansible_module_compression' from source: unknown 11728 1726882228.77303: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882228.77307: variable 'ansible_facts' from source: unknown 11728 1726882228.77342: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882228.732852-14304-36337781574062/AnsiballZ_command.py 11728 1726882228.77816: Sending initial data 11728 1726882228.77825: Sent initial data (154 bytes) 11728 1726882228.78736: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882228.78747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882228.78825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882228.79016: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882228.79077: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882228.80807: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882228.80811: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882228.80813: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpji_31m_g /root/.ansible/tmp/ansible-tmp-1726882228.732852-14304-36337781574062/AnsiballZ_command.py <<< 11728 1726882228.80816: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882228.732852-14304-36337781574062/AnsiballZ_command.py" <<< 11728 1726882228.80926: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpji_31m_g" to remote "/root/.ansible/tmp/ansible-tmp-1726882228.732852-14304-36337781574062/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882228.732852-14304-36337781574062/AnsiballZ_command.py" <<< 11728 1726882228.81916: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882228.81920: stderr chunk (state=3): >>><<< 11728 1726882228.81925: stdout chunk (state=3): >>><<< 11728 1726882228.81978: done transferring module to remote 11728 1726882228.81986: _low_level_execute_command(): starting 11728 1726882228.81991: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882228.732852-14304-36337781574062/ /root/.ansible/tmp/ansible-tmp-1726882228.732852-14304-36337781574062/AnsiballZ_command.py && sleep 0' 11728 1726882228.83089: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882228.83092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882228.83214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882228.83223: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882228.83241: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 11728 1726882228.83247: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882228.83401: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882228.83428: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882228.83434: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882228.83450: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882228.83707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882228.85560: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882228.85623: stderr chunk (state=3): >>><<< 11728 1726882228.85626: stdout chunk (state=3): >>><<< 11728 1726882228.85645: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882228.85648: _low_level_execute_command(): starting 11728 1726882228.85652: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882228.732852-14304-36337781574062/AnsiballZ_command.py && sleep 0' 11728 1726882228.87009: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882228.87016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882228.87039: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882228.87046: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882228.87054: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11728 1726882228.87061: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882228.87071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882228.87083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882228.87100: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882228.87267: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882228.87289: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882228.87299: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882228.87388: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882229.02778: stdout chunk (state=3): >>> {"changed": true, "stdout": "test1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/primary"], "start": "2024-09-20 21:30:29.023076", "end": "2024-09-20 21:30:29.026152", "delta": "0:00:00.003076", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/primary", 
"_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882229.04421: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882229.04467: stderr chunk (state=3): >>><<< 11728 1726882229.04470: stdout chunk (state=3): >>><<< 11728 1726882229.04702: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "test1", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/nm-bond/bonding/primary"], "start": "2024-09-20 21:30:29.023076", "end": "2024-09-20 21:30:29.026152", "delta": "0:00:00.003076", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/nm-bond/bonding/primary", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
11728 1726882229.04705: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/nm-bond/bonding/primary', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882228.732852-14304-36337781574062/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882229.04713: _low_level_execute_command(): starting 11728 1726882229.04715: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882228.732852-14304-36337781574062/ > /dev/null 2>&1 && sleep 0' 11728 1726882229.05125: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882229.05134: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882229.05146: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882229.05160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882229.05172: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882229.05180: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882229.05190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882229.05212: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882229.05221: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882229.05228: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11728 1726882229.05236: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882229.05246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882229.05262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882229.05268: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882229.05271: stderr chunk (state=3): >>>debug2: match found <<< 11728 1726882229.05279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882229.05351: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882229.05374: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882229.05386: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882229.05464: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882229.07399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882229.07402: stdout chunk (state=3): >>><<< 11728 1726882229.07404: stderr chunk (state=3): >>><<< 11728 1726882229.07407: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882229.07409: handler run complete 11728 1726882229.07411: Evaluated conditional (False): False 11728 1726882229.07496: variable 'bond_opt' from source: unknown 11728 1726882229.07504: variable 'result' from source: set_fact 11728 1726882229.07518: Evaluated conditional (bond_opt.value in result.stdout): True 11728 1726882229.07529: attempt loop complete, returning result 11728 1726882229.07545: variable 'bond_opt' from source: unknown 11728 1726882229.07615: variable 'bond_opt' from source: unknown ok: [managed_node3] => (item={'key': 'primary', 'value': 'test1'}) => { "ansible_loop_var": "bond_opt", "attempts": 1, "bond_opt": { "key": "primary", "value": "test1" }, "changed": false, "cmd": [ "cat", "/sys/class/net/nm-bond/bonding/primary" ], "delta": "0:00:00.003076", "end": "2024-09-20 21:30:29.026152", "rc": 0, "start": "2024-09-20 21:30:29.023076" } STDOUT: test1 11728 1726882229.07747: dumping result to json 11728 1726882229.07750: done dumping result, returning 11728 1726882229.07752: done running TaskExecutor() for managed_node3/TASK: ** TEST check bond settings [12673a56-9f93-5c28-a762-000000000c2a] 11728 1726882229.07755: sending task result for task 12673a56-9f93-5c28-a762-000000000c2a 11728 1726882229.08141: no more pending results, returning what we have 11728 1726882229.08145: results queue empty 11728 1726882229.08146: checking for any_errors_fatal 11728 1726882229.08148: done checking for any_errors_fatal 11728 1726882229.08148: checking for max_fail_percentage 11728 1726882229.08150: done checking for max_fail_percentage 11728 1726882229.08150: checking to see if all hosts have failed and the running result is not ok 11728 1726882229.08151: done checking to see if all hosts have failed 11728 1726882229.08152: getting the remaining hosts for this loop 11728 1726882229.08153: done getting the remaining hosts for this loop 11728 1726882229.08155: getting the next task for host managed_node3 11728 1726882229.08161: done getting next task for host managed_node3 11728 1726882229.08163: ^ task is: TASK: Include the task 'assert_IPv4_present.yml' 11728 1726882229.08166: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882229.08169: getting variables 11728 1726882229.08170: in VariableManager get_vars() 11728 1726882229.08210: Calling all_inventory to load vars for managed_node3 11728 1726882229.08213: Calling groups_inventory to load vars for managed_node3 11728 1726882229.08215: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882229.08221: done sending task result for task 12673a56-9f93-5c28-a762-000000000c2a 11728 1726882229.08223: WORKER PROCESS EXITING 11728 1726882229.08231: Calling all_plugins_play to load vars for managed_node3 11728 1726882229.08234: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882229.08236: Calling groups_plugins_play to load vars for managed_node3 11728 1726882229.09652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882229.11156: done with get_vars() 11728 1726882229.11180: done getting variables TASK [Include the task 'assert_IPv4_present.yml'] ****************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:11 Friday 20 September 2024 21:30:29 -0400 (0:00:01.865) 0:00:53.965 ****** 11728 1726882229.11285: entering _queue_task() for managed_node3/include_tasks 11728 1726882229.11632: worker is 1 (out of 1 available) 11728 1726882229.11646: exiting _queue_task() for managed_node3/include_tasks 11728 1726882229.11659: done queuing things up, now waiting for results queue to drain 11728 1726882229.11661: waiting for pending results... 
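The step that runs next (task path .../assert_bond_options.yml:11) includes tasks/assert_IPv4_present.yml, and the log attributes 'item', 'interface' and 'address' to include params, so the file name and the values under test arrive as parameters of an include rather than as play vars. A hedged sketch of that pattern follows; whether interface and address are attached to this include or to an outer one is not visible in the log, and the address literal is only a placeholder.

# Sketch of an include passing test parameters; the variable names follow
# what the log reports as "include params", the address value is a placeholder.
- name: "Include the task '{{ item }}'"
  ansible.builtin.include_tasks: "{{ item }}"
  vars:
    interface: "{{ controller_device }}"
    address: "192.0.2"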
11728 1726882229.11951: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_IPv4_present.yml' 11728 1726882229.12066: in run() - task 12673a56-9f93-5c28-a762-000000000c2c 11728 1726882229.12083: variable 'ansible_search_path' from source: unknown 11728 1726882229.12089: variable 'ansible_search_path' from source: unknown 11728 1726882229.12138: calling self._execute() 11728 1726882229.12254: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882229.12266: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882229.12280: variable 'omit' from source: magic vars 11728 1726882229.12637: variable 'ansible_distribution_major_version' from source: facts 11728 1726882229.12656: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882229.12668: _execute() done 11728 1726882229.12770: dumping result to json 11728 1726882229.12774: done dumping result, returning 11728 1726882229.12776: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_IPv4_present.yml' [12673a56-9f93-5c28-a762-000000000c2c] 11728 1726882229.12779: sending task result for task 12673a56-9f93-5c28-a762-000000000c2c 11728 1726882229.12849: done sending task result for task 12673a56-9f93-5c28-a762-000000000c2c 11728 1726882229.12853: WORKER PROCESS EXITING 11728 1726882229.12880: no more pending results, returning what we have 11728 1726882229.12885: in VariableManager get_vars() 11728 1726882229.12940: Calling all_inventory to load vars for managed_node3 11728 1726882229.12944: Calling groups_inventory to load vars for managed_node3 11728 1726882229.12946: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882229.12960: Calling all_plugins_play to load vars for managed_node3 11728 1726882229.12963: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882229.12966: Calling groups_plugins_play to load vars for managed_node3 11728 1726882229.14483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882229.16199: done with get_vars() 11728 1726882229.16218: variable 'ansible_search_path' from source: unknown 11728 1726882229.16220: variable 'ansible_search_path' from source: unknown 11728 1726882229.16231: variable 'item' from source: include params 11728 1726882229.16360: variable 'item' from source: include params 11728 1726882229.16401: we have included files to process 11728 1726882229.16402: generating all_blocks data 11728 1726882229.16405: done generating all_blocks data 11728 1726882229.16410: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 11728 1726882229.16411: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 11728 1726882229.16414: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml 11728 1726882229.16649: done processing included file 11728 1726882229.16658: iterating over new_blocks loaded from include file 11728 1726882229.16660: in VariableManager get_vars() 11728 1726882229.16684: done with get_vars() 11728 1726882229.16686: filtering new block on tags 11728 1726882229.16722: done filtering new block on tags 11728 1726882229.16725: done iterating over new_blocks loaded from include file included: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml for managed_node3 11728 1726882229.16731: extending task lists for all hosts with included blocks 11728 1726882229.16974: done extending task lists 11728 1726882229.16976: done processing included files 11728 1726882229.16977: results queue empty 11728 1726882229.16985: checking for any_errors_fatal 11728 1726882229.16998: done checking for any_errors_fatal 11728 1726882229.17000: checking for max_fail_percentage 11728 1726882229.17001: done checking for max_fail_percentage 11728 1726882229.17002: checking to see if all hosts have failed and the running result is not ok 11728 1726882229.17003: done checking to see if all hosts have failed 11728 1726882229.17003: getting the remaining hosts for this loop 11728 1726882229.17005: done getting the remaining hosts for this loop 11728 1726882229.17008: getting the next task for host managed_node3 11728 1726882229.17013: done getting next task for host managed_node3 11728 1726882229.17015: ^ task is: TASK: ** TEST check IPv4 11728 1726882229.17018: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882229.17020: getting variables 11728 1726882229.17021: in VariableManager get_vars() 11728 1726882229.17036: Calling all_inventory to load vars for managed_node3 11728 1726882229.17039: Calling groups_inventory to load vars for managed_node3 11728 1726882229.17041: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882229.17047: Calling all_plugins_play to load vars for managed_node3 11728 1726882229.17049: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882229.17052: Calling groups_plugins_play to load vars for managed_node3 11728 1726882229.18612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882229.20512: done with get_vars() 11728 1726882229.20544: done getting variables 11728 1726882229.20586: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv4_present.yml:3 Friday 20 September 2024 21:30:29 -0400 (0:00:00.093) 0:00:54.058 ****** 11728 1726882229.20623: entering _queue_task() for managed_node3/command 11728 1726882229.21119: worker is 1 (out of 1 available) 11728 1726882229.21133: exiting _queue_task() for managed_node3/command 11728 1726882229.21145: done queuing things up, now waiting for results queue to drain 11728 1726882229.21146: waiting for pending results... 
11728 1726882229.21713: running TaskExecutor() for managed_node3/TASK: ** TEST check IPv4 11728 1726882229.21801: in run() - task 12673a56-9f93-5c28-a762-000000000da6 11728 1726882229.21806: variable 'ansible_search_path' from source: unknown 11728 1726882229.21810: variable 'ansible_search_path' from source: unknown 11728 1726882229.21813: calling self._execute() 11728 1726882229.21816: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882229.21818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882229.21821: variable 'omit' from source: magic vars 11728 1726882229.22713: variable 'ansible_distribution_major_version' from source: facts 11728 1726882229.22717: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882229.22720: variable 'omit' from source: magic vars 11728 1726882229.22831: variable 'omit' from source: magic vars 11728 1726882229.23224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882229.28989: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882229.29203: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882229.29355: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882229.29391: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882229.29431: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882229.29559: variable 'interface' from source: include params 11728 1726882229.29567: variable 'controller_device' from source: play vars 11728 1726882229.29831: variable 'controller_device' from source: play vars 11728 1726882229.29858: variable 'omit' from source: magic vars 11728 1726882229.29890: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882229.29922: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882229.29940: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882229.29957: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882229.29968: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882229.30323: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882229.30327: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882229.30331: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882229.30616: Set connection var ansible_connection to ssh 11728 1726882229.30620: Set connection var ansible_shell_executable to /bin/sh 11728 1726882229.30622: Set connection var ansible_timeout to 10 11728 1726882229.30624: Set connection var ansible_shell_type to sh 11728 1726882229.30657: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882229.30660: Set connection var ansible_pipelining to False 11728 1726882229.30688: variable 'ansible_shell_executable' from source: unknown 11728 1726882229.30692: variable 
'ansible_connection' from source: unknown 11728 1726882229.30697: variable 'ansible_module_compression' from source: unknown 11728 1726882229.30699: variable 'ansible_shell_type' from source: unknown 11728 1726882229.30702: variable 'ansible_shell_executable' from source: unknown 11728 1726882229.30707: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882229.30709: variable 'ansible_pipelining' from source: unknown 11728 1726882229.30714: variable 'ansible_timeout' from source: unknown 11728 1726882229.30716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882229.30942: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882229.30952: variable 'omit' from source: magic vars 11728 1726882229.30958: starting attempt loop 11728 1726882229.30961: running the handler 11728 1726882229.31039: _low_level_execute_command(): starting 11728 1726882229.31051: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882229.31914: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882229.32008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882229.32039: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882229.32120: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882229.33983: stdout chunk (state=3): >>>/root <<< 11728 1726882229.34034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882229.34038: stdout chunk (state=3): >>><<< 11728 1726882229.34041: stderr chunk (state=3): >>><<< 11728 1726882229.34107: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882229.34110: _low_level_execute_command(): starting 11728 1726882229.34113: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882229.3407412-14406-138355329119513 `" && echo ansible-tmp-1726882229.3407412-14406-138355329119513="` echo /root/.ansible/tmp/ansible-tmp-1726882229.3407412-14406-138355329119513 `" ) && sleep 0' 11728 1726882229.34711: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882229.34772: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882229.34792: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882229.34830: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882229.34870: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882229.36755: stdout chunk (state=3): >>>ansible-tmp-1726882229.3407412-14406-138355329119513=/root/.ansible/tmp/ansible-tmp-1726882229.3407412-14406-138355329119513 <<< 11728 1726882229.36867: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882229.36870: stdout chunk (state=3): >>><<< 11728 1726882229.36901: stderr chunk (state=3): >>><<< 11728 1726882229.36982: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882229.3407412-14406-138355329119513=/root/.ansible/tmp/ansible-tmp-1726882229.3407412-14406-138355329119513 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882229.37015: variable 'ansible_module_compression' from source: unknown 11728 1726882229.37061: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882229.37192: variable 'ansible_facts' from source: unknown 11728 1726882229.37308: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882229.3407412-14406-138355329119513/AnsiballZ_command.py 11728 1726882229.37588: Sending initial data 11728 1726882229.37637: Sent initial data (156 bytes) 11728 1726882229.38184: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882229.38256: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882229.38279: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882229.38320: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882229.38367: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882229.39877: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882229.39918: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882229.39975: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpvhmz2zk4 /root/.ansible/tmp/ansible-tmp-1726882229.3407412-14406-138355329119513/AnsiballZ_command.py <<< 11728 1726882229.39978: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882229.3407412-14406-138355329119513/AnsiballZ_command.py" <<< 11728 1726882229.40004: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpvhmz2zk4" to remote "/root/.ansible/tmp/ansible-tmp-1726882229.3407412-14406-138355329119513/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882229.3407412-14406-138355329119513/AnsiballZ_command.py" <<< 11728 1726882229.40966: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882229.41304: stderr chunk (state=3): >>><<< 11728 1726882229.41308: stdout chunk (state=3): >>><<< 11728 1726882229.41404: done transferring module to remote 11728 1726882229.41408: _low_level_execute_command(): starting 11728 1726882229.41410: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882229.3407412-14406-138355329119513/ /root/.ansible/tmp/ansible-tmp-1726882229.3407412-14406-138355329119513/AnsiballZ_command.py && sleep 0' 11728 1726882229.42216: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882229.42274: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882229.42345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882229.42362: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882229.42391: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882229.42473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882229.44225: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882229.44229: stdout chunk (state=3): >>><<< 11728 1726882229.44231: stderr chunk (state=3): >>><<< 11728 1726882229.44315: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882229.44468: _low_level_execute_command(): starting 11728 1726882229.44473: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882229.3407412-14406-138355329119513/AnsiballZ_command.py && sleep 0' 11728 1726882229.45051: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882229.45061: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882229.45072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882229.45126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882229.45130: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882229.45132: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882229.45135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882229.45137: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882229.45139: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882229.45146: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11728 1726882229.45155: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882229.45216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882229.45290: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882229.45296: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882229.45299: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882229.45343: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882229.61520: stdout chunk (state=3): >>> {"changed": true, "stdout": "24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.214/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 237sec preferred_lft 237sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 21:30:29.607431", "end": "2024-09-20 21:30:29.611225", "delta": "0:00:00.003794", "msg": "", "invocation": 
{"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882229.62820: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882229.62859: stderr chunk (state=3): >>><<< 11728 1726882229.62862: stdout chunk (state=3): >>><<< 11728 1726882229.62987: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.214/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 237sec preferred_lft 237sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 21:30:29.607431", "end": "2024-09-20 21:30:29.611225", "delta": "0:00:00.003794", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
11728 1726882229.62991: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882229.3407412-14406-138355329119513/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882229.62997: _low_level_execute_command(): starting 11728 1726882229.62999: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882229.3407412-14406-138355329119513/ > /dev/null 2>&1 && sleep 0' 11728 1726882229.63526: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882229.63541: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882229.63608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882229.63653: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882229.63669: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882229.63692: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882229.63771: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882229.65855: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882229.65859: stdout chunk (state=3): >>><<< 11728 1726882229.65901: stderr chunk (state=3): >>><<< 11728 1726882229.65932: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882229.65938: handler run complete 11728 1726882229.65961: Evaluated conditional (False): False 11728 1726882229.66382: variable 'address' from source: include params 11728 1726882229.66386: variable 'result' from source: set_fact 11728 1726882229.66498: Evaluated conditional (address in result.stdout): True 11728 1726882229.66502: attempt loop complete, returning result 11728 1726882229.66504: _execute() done 11728 1726882229.66506: dumping result to json 11728 1726882229.66508: done dumping result, returning 11728 1726882229.66510: done running TaskExecutor() for managed_node3/TASK: ** TEST check IPv4 [12673a56-9f93-5c28-a762-000000000da6] 11728 1726882229.66512: sending task result for task 12673a56-9f93-5c28-a762-000000000da6 11728 1726882229.66785: done sending task result for task 12673a56-9f93-5c28-a762-000000000da6 11728 1726882229.66788: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "nm-bond" ], "delta": "0:00:00.003794", "end": "2024-09-20 21:30:29.611225", "rc": 0, "start": "2024-09-20 21:30:29.607431" } STDOUT: 24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.214/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond valid_lft 237sec preferred_lft 237sec 11728 1726882229.66876: no more pending results, returning what we have 11728 1726882229.66881: results queue empty 11728 1726882229.66882: checking for any_errors_fatal 11728 1726882229.66883: done checking for any_errors_fatal 11728 1726882229.66884: checking for max_fail_percentage 11728 1726882229.66886: done checking for max_fail_percentage 11728 1726882229.66887: checking to see if all hosts have failed and the running result is not ok 11728 1726882229.66888: done checking to see if all hosts have failed 11728 1726882229.66888: getting the remaining hosts for this loop 11728 1726882229.66890: done getting the remaining hosts for this loop 11728 1726882229.66897: getting the next task for host managed_node3 11728 1726882229.66907: done getting next task for host managed_node3 11728 1726882229.66910: ^ task is: TASK: Include the task 'assert_IPv6_present.yml' 11728 1726882229.66914: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882229.66919: getting variables 11728 1726882229.66920: in VariableManager get_vars() 11728 1726882229.66965: Calling all_inventory to load vars for managed_node3 11728 1726882229.66968: Calling groups_inventory to load vars for managed_node3 11728 1726882229.66970: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882229.66981: Calling all_plugins_play to load vars for managed_node3 11728 1726882229.66984: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882229.66987: Calling groups_plugins_play to load vars for managed_node3 11728 1726882229.70133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882229.73161: done with get_vars() 11728 1726882229.73197: done getting variables TASK [Include the task 'assert_IPv6_present.yml'] ****************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:16 Friday 20 September 2024 21:30:29 -0400 (0:00:00.526) 0:00:54.585 ****** 11728 1726882229.73311: entering _queue_task() for managed_node3/include_tasks 11728 1726882229.73762: worker is 1 (out of 1 available) 11728 1726882229.73776: exiting _queue_task() for managed_node3/include_tasks 11728 1726882229.73790: done queuing things up, now waiting for results queue to drain 11728 1726882229.73792: waiting for pending results... 11728 1726882229.74162: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_IPv6_present.yml' 11728 1726882229.74234: in run() - task 12673a56-9f93-5c28-a762-000000000c2d 11728 1726882229.74263: variable 'ansible_search_path' from source: unknown 11728 1726882229.74272: variable 'ansible_search_path' from source: unknown 11728 1726882229.74319: calling self._execute() 11728 1726882229.74426: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882229.74438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882229.74476: variable 'omit' from source: magic vars 11728 1726882229.74857: variable 'ansible_distribution_major_version' from source: facts 11728 1726882229.74876: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882229.74919: _execute() done 11728 1726882229.75001: dumping result to json 11728 1726882229.75005: done dumping result, returning 11728 1726882229.75008: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_IPv6_present.yml' [12673a56-9f93-5c28-a762-000000000c2d] 11728 1726882229.75010: sending task result for task 12673a56-9f93-5c28-a762-000000000c2d 11728 1726882229.75083: done sending task result for task 12673a56-9f93-5c28-a762-000000000c2d 11728 1726882229.75086: WORKER PROCESS EXITING 11728 1726882229.75117: no more pending results, returning what we have 11728 1726882229.75123: in VariableManager get_vars() 11728 1726882229.75173: Calling all_inventory to load vars for managed_node3 11728 1726882229.75177: Calling groups_inventory to load vars for managed_node3 11728 1726882229.75180: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882229.75197: Calling all_plugins_play to load vars for managed_node3 11728 1726882229.75201: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882229.75204: Calling groups_plugins_play to load vars for managed_node3 11728 1726882229.77268: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 11728 1726882229.79343: done with get_vars() 11728 1726882229.79364: variable 'ansible_search_path' from source: unknown 11728 1726882229.79366: variable 'ansible_search_path' from source: unknown 11728 1726882229.79375: variable 'item' from source: include params 11728 1726882229.79483: variable 'item' from source: include params 11728 1726882229.79523: we have included files to process 11728 1726882229.79525: generating all_blocks data 11728 1726882229.79527: done generating all_blocks data 11728 1726882229.79531: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 11728 1726882229.79532: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 11728 1726882229.79534: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml 11728 1726882229.79742: done processing included file 11728 1726882229.79744: iterating over new_blocks loaded from include file 11728 1726882229.79752: in VariableManager get_vars() 11728 1726882229.79775: done with get_vars() 11728 1726882229.79777: filtering new block on tags 11728 1726882229.79834: done filtering new block on tags 11728 1726882229.79837: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml for managed_node3 11728 1726882229.79842: extending task lists for all hosts with included blocks 11728 1726882229.80558: done extending task lists 11728 1726882229.80559: done processing included files 11728 1726882229.80560: results queue empty 11728 1726882229.80560: checking for any_errors_fatal 11728 1726882229.80566: done checking for any_errors_fatal 11728 1726882229.80566: checking for max_fail_percentage 11728 1726882229.80568: done checking for max_fail_percentage 11728 1726882229.80568: checking to see if all hosts have failed and the running result is not ok 11728 1726882229.80569: done checking to see if all hosts have failed 11728 1726882229.80570: getting the remaining hosts for this loop 11728 1726882229.80571: done getting the remaining hosts for this loop 11728 1726882229.80573: getting the next task for host managed_node3 11728 1726882229.80578: done getting next task for host managed_node3 11728 1726882229.80580: ^ task is: TASK: ** TEST check IPv6 11728 1726882229.80583: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882229.80585: getting variables 11728 1726882229.80586: in VariableManager get_vars() 11728 1726882229.80606: Calling all_inventory to load vars for managed_node3 11728 1726882229.80609: Calling groups_inventory to load vars for managed_node3 11728 1726882229.80611: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882229.80616: Calling all_plugins_play to load vars for managed_node3 11728 1726882229.80618: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882229.80621: Calling groups_plugins_play to load vars for managed_node3 11728 1726882229.82353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882229.84710: done with get_vars() 11728 1726882229.84805: done getting variables 11728 1726882229.84925: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_IPv6_present.yml:3 Friday 20 September 2024 21:30:29 -0400 (0:00:00.116) 0:00:54.702 ****** 11728 1726882229.84966: entering _queue_task() for managed_node3/command 11728 1726882229.85257: worker is 1 (out of 1 available) 11728 1726882229.85270: exiting _queue_task() for managed_node3/command 11728 1726882229.85281: done queuing things up, now waiting for results queue to drain 11728 1726882229.85283: waiting for pending results... 
11728 1726882229.85469: running TaskExecutor() for managed_node3/TASK: ** TEST check IPv6 11728 1726882229.85554: in run() - task 12673a56-9f93-5c28-a762-000000000dc7 11728 1726882229.85567: variable 'ansible_search_path' from source: unknown 11728 1726882229.85570: variable 'ansible_search_path' from source: unknown 11728 1726882229.85600: calling self._execute() 11728 1726882229.85672: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882229.85676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882229.85686: variable 'omit' from source: magic vars 11728 1726882229.85962: variable 'ansible_distribution_major_version' from source: facts 11728 1726882229.85971: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882229.85977: variable 'omit' from source: magic vars 11728 1726882229.86021: variable 'omit' from source: magic vars 11728 1726882229.86139: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882229.88817: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882229.88865: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882229.88894: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882229.88923: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882229.88943: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882229.89007: variable 'controller_device' from source: play vars 11728 1726882229.89025: variable 'omit' from source: magic vars 11728 1726882229.89048: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882229.89069: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882229.89083: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882229.89097: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882229.89111: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882229.89133: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882229.89136: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882229.89139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882229.89203: Set connection var ansible_connection to ssh 11728 1726882229.89213: Set connection var ansible_shell_executable to /bin/sh 11728 1726882229.89216: Set connection var ansible_timeout to 10 11728 1726882229.89219: Set connection var ansible_shell_type to sh 11728 1726882229.89228: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882229.89232: Set connection var ansible_pipelining to False 11728 1726882229.89250: variable 'ansible_shell_executable' from source: unknown 11728 1726882229.89253: variable 'ansible_connection' from source: unknown 11728 1726882229.89256: variable 'ansible_module_compression' from source: unknown 11728 1726882229.89258: variable 
'ansible_shell_type' from source: unknown 11728 1726882229.89261: variable 'ansible_shell_executable' from source: unknown 11728 1726882229.89263: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882229.89265: variable 'ansible_pipelining' from source: unknown 11728 1726882229.89268: variable 'ansible_timeout' from source: unknown 11728 1726882229.89272: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882229.89348: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882229.89357: variable 'omit' from source: magic vars 11728 1726882229.89362: starting attempt loop 11728 1726882229.89364: running the handler 11728 1726882229.89377: _low_level_execute_command(): starting 11728 1726882229.89383: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882229.89845: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882229.89849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882229.89852: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882229.89854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882229.89901: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882229.89918: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882229.89995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882229.91703: stdout chunk (state=3): >>>/root <<< 11728 1726882229.91849: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882229.91852: stdout chunk (state=3): >>><<< 11728 1726882229.91855: stderr chunk (state=3): >>><<< 11728 1726882229.91875: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882229.91966: _low_level_execute_command(): starting 11728 1726882229.91969: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882229.918836-14440-22706974787927 `" && echo ansible-tmp-1726882229.918836-14440-22706974787927="` echo /root/.ansible/tmp/ansible-tmp-1726882229.918836-14440-22706974787927 `" ) && sleep 0' 11728 1726882229.92490: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882229.92510: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882229.92529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882229.92546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882229.92564: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882229.92577: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882229.92615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882229.92633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882229.92711: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882229.92724: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882229.92740: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882229.92766: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882229.92845: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882229.94710: stdout chunk (state=3): >>>ansible-tmp-1726882229.918836-14440-22706974787927=/root/.ansible/tmp/ansible-tmp-1726882229.918836-14440-22706974787927 <<< 11728 1726882229.94853: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882229.94856: stdout chunk (state=3): >>><<< 11728 1726882229.94858: stderr chunk (state=3): >>><<< 11728 1726882229.94901: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882229.918836-14440-22706974787927=/root/.ansible/tmp/ansible-tmp-1726882229.918836-14440-22706974787927 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882229.94919: variable 'ansible_module_compression' from source: unknown 11728 1726882229.94978: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882229.95081: variable 'ansible_facts' from source: unknown 11728 1726882229.95120: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882229.918836-14440-22706974787927/AnsiballZ_command.py 11728 1726882229.95375: Sending initial data 11728 1726882229.95387: Sent initial data (154 bytes) 11728 1726882229.95912: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882229.95929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882229.95935: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882229.96010: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882229.96038: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882229.96042: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882229.96107: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882229.97611: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11728 1726882229.97647: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 
debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882229.97700: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882229.97753: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmptw24huqy /root/.ansible/tmp/ansible-tmp-1726882229.918836-14440-22706974787927/AnsiballZ_command.py <<< 11728 1726882229.97756: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882229.918836-14440-22706974787927/AnsiballZ_command.py" <<< 11728 1726882229.97795: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmptw24huqy" to remote "/root/.ansible/tmp/ansible-tmp-1726882229.918836-14440-22706974787927/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882229.918836-14440-22706974787927/AnsiballZ_command.py" <<< 11728 1726882229.98510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882229.98561: stderr chunk (state=3): >>><<< 11728 1726882229.98573: stdout chunk (state=3): >>><<< 11728 1726882229.98723: done transferring module to remote 11728 1726882229.98727: _low_level_execute_command(): starting 11728 1726882229.98729: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882229.918836-14440-22706974787927/ /root/.ansible/tmp/ansible-tmp-1726882229.918836-14440-22706974787927/AnsiballZ_command.py && sleep 0' 11728 1726882229.99274: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882229.99295: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882229.99314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882229.99428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882229.99433: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882229.99458: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882229.99528: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882230.01326: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 11728 1726882230.01330: stdout chunk (state=3): >>><<< 11728 1726882230.01500: stderr chunk (state=3): >>><<< 11728 1726882230.01504: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882230.01507: _low_level_execute_command(): starting 11728 1726882230.01510: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882229.918836-14440-22706974787927/AnsiballZ_command.py && sleep 0' 11728 1726882230.02016: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882230.02038: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882230.02109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882230.02156: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882230.02168: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882230.02186: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882230.02274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882230.18119: stdout chunk (state=3): >>> {"changed": true, "stdout": "24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::129/128 scope global dynamic noprefixroute \n valid_lft 238sec preferred_lft 238sec\n inet6 2001:db8::74da:5eff:fead:eb5b/64 scope global dynamic noprefixroute \n valid_lft 1798sec preferred_lft 1798sec\n inet6 
fe80::74da:5eff:fead:eb5b/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-20 21:30:30.176051", "end": "2024-09-20 21:30:30.179541", "delta": "0:00:00.003490", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882230.19822: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882230.19826: stdout chunk (state=3): >>><<< 11728 1726882230.19828: stderr chunk (state=3): >>><<< 11728 1726882230.20058: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::129/128 scope global dynamic noprefixroute \n valid_lft 238sec preferred_lft 238sec\n inet6 2001:db8::74da:5eff:fead:eb5b/64 scope global dynamic noprefixroute \n valid_lft 1798sec preferred_lft 1798sec\n inet6 fe80::74da:5eff:fead:eb5b/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-20 21:30:30.176051", "end": "2024-09-20 21:30:30.179541", "delta": "0:00:00.003490", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
11728 1726882230.20067: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882229.918836-14440-22706974787927/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882230.20070: _low_level_execute_command(): starting 11728 1726882230.20072: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882229.918836-14440-22706974787927/ > /dev/null 2>&1 && sleep 0' 11728 1726882230.20644: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882230.20657: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882230.20670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882230.20686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882230.20746: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882230.20807: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882230.20826: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882230.20853: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882230.20975: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882230.22806: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882230.22831: stdout chunk (state=3): >>><<< 11728 1726882230.22834: stderr chunk (state=3): >>><<< 11728 1726882230.22851: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882230.22999: handler run complete 11728 1726882230.23002: Evaluated conditional (False): False 11728 1726882230.23046: variable 'address' from source: include params 11728 1726882230.23056: variable 'result' from source: set_fact 11728 1726882230.23076: Evaluated conditional (address in result.stdout): True 11728 1726882230.23096: attempt loop complete, returning result 11728 1726882230.23104: _execute() done 11728 1726882230.23112: dumping result to json 11728 1726882230.23121: done dumping result, returning 11728 1726882230.23134: done running TaskExecutor() for managed_node3/TASK: ** TEST check IPv6 [12673a56-9f93-5c28-a762-000000000dc7] 11728 1726882230.23144: sending task result for task 12673a56-9f93-5c28-a762-000000000dc7 11728 1726882230.23268: done sending task result for task 12673a56-9f93-5c28-a762-000000000dc7 11728 1726882230.23275: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "nm-bond" ], "delta": "0:00:00.003490", "end": "2024-09-20 21:30:30.179541", "rc": 0, "start": "2024-09-20 21:30:30.176051" } STDOUT: 24: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 2001:db8::129/128 scope global dynamic noprefixroute valid_lft 238sec preferred_lft 238sec inet6 2001:db8::74da:5eff:fead:eb5b/64 scope global dynamic noprefixroute valid_lft 1798sec preferred_lft 1798sec inet6 fe80::74da:5eff:fead:eb5b/64 scope link noprefixroute valid_lft forever preferred_lft forever 11728 1726882230.23353: no more pending results, returning what we have 11728 1726882230.23357: results queue empty 11728 1726882230.23358: checking for any_errors_fatal 11728 1726882230.23359: done checking for any_errors_fatal 11728 1726882230.23360: checking for max_fail_percentage 11728 1726882230.23362: done checking for max_fail_percentage 11728 1726882230.23363: checking to see if all hosts have failed and the running result is not ok 11728 1726882230.23363: done checking to see if all hosts have failed 11728 1726882230.23364: getting the remaining hosts for this loop 11728 1726882230.23366: done getting the remaining hosts for this loop 11728 1726882230.23369: getting the next task for host managed_node3 11728 1726882230.23378: done getting next task for host managed_node3 11728 1726882230.23381: ^ task is: TASK: Conditional asserts 11728 1726882230.23383: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882230.23388: getting variables 11728 1726882230.23389: in VariableManager get_vars() 11728 1726882230.23433: Calling all_inventory to load vars for managed_node3 11728 1726882230.23436: Calling groups_inventory to load vars for managed_node3 11728 1726882230.23438: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882230.23449: Calling all_plugins_play to load vars for managed_node3 11728 1726882230.23451: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882230.23454: Calling groups_plugins_play to load vars for managed_node3 11728 1726882230.24921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882230.26424: done with get_vars() 11728 1726882230.26454: done getting variables TASK [Conditional asserts] ***************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:42 Friday 20 September 2024 21:30:30 -0400 (0:00:00.415) 0:00:55.117 ****** 11728 1726882230.26558: entering _queue_task() for managed_node3/include_tasks 11728 1726882230.27246: worker is 1 (out of 1 available) 11728 1726882230.27261: exiting _queue_task() for managed_node3/include_tasks 11728 1726882230.27275: done queuing things up, now waiting for results queue to drain 11728 1726882230.27277: waiting for pending results... 11728 1726882230.27816: running TaskExecutor() for managed_node3/TASK: Conditional asserts 11728 1726882230.28048: in run() - task 12673a56-9f93-5c28-a762-0000000008f0 11728 1726882230.28623: variable 'ansible_search_path' from source: unknown 11728 1726882230.28627: variable 'ansible_search_path' from source: unknown 11728 1726882230.29265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882230.32980: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882230.33063: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882230.33172: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882230.33175: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882230.33178: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882230.33239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882230.33274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882230.33301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882230.33340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882230.33390: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882230.33510: dumping result to json 11728 1726882230.33513: done dumping result, returning 11728 1726882230.33518: done running TaskExecutor() for managed_node3/TASK: Conditional asserts [12673a56-9f93-5c28-a762-0000000008f0] 11728 1726882230.33524: sending task result for task 12673a56-9f93-5c28-a762-0000000008f0 11728 1726882230.33619: done sending task result for task 12673a56-9f93-5c28-a762-0000000008f0 11728 1726882230.33623: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "skipped_reason": "No items in the list" } 11728 1726882230.33665: no more pending results, returning what we have 11728 1726882230.33670: results queue empty 11728 1726882230.33670: checking for any_errors_fatal 11728 1726882230.33679: done checking for any_errors_fatal 11728 1726882230.33680: checking for max_fail_percentage 11728 1726882230.33681: done checking for max_fail_percentage 11728 1726882230.33682: checking to see if all hosts have failed and the running result is not ok 11728 1726882230.33683: done checking to see if all hosts have failed 11728 1726882230.33683: getting the remaining hosts for this loop 11728 1726882230.33685: done getting the remaining hosts for this loop 11728 1726882230.33689: getting the next task for host managed_node3 11728 1726882230.33697: done getting next task for host managed_node3 11728 1726882230.33700: ^ task is: TASK: Success in test '{{ lsr_description }}' 11728 1726882230.33703: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882230.33707: getting variables 11728 1726882230.33709: in VariableManager get_vars() 11728 1726882230.33750: Calling all_inventory to load vars for managed_node3 11728 1726882230.33753: Calling groups_inventory to load vars for managed_node3 11728 1726882230.33755: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882230.33765: Calling all_plugins_play to load vars for managed_node3 11728 1726882230.33767: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882230.33770: Calling groups_plugins_play to load vars for managed_node3 11728 1726882230.35363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882230.37163: done with get_vars() 11728 1726882230.37189: done getting variables 11728 1726882230.37258: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11728 1726882230.37388: variable 'lsr_description' from source: include params TASK [Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.'] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:47 Friday 20 September 2024 21:30:30 -0400 (0:00:00.108) 0:00:55.226 ****** 11728 1726882230.37424: entering _queue_task() for managed_node3/debug 11728 1726882230.38016: worker is 1 (out of 1 available) 11728 1726882230.38028: exiting _queue_task() for managed_node3/debug 11728 1726882230.38038: done queuing things up, now waiting for results queue to drain 11728 1726882230.38040: waiting for pending results... 11728 1726882230.38315: running TaskExecutor() for managed_node3/TASK: Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' 
11728 1726882230.38320: in run() - task 12673a56-9f93-5c28-a762-0000000008f1 11728 1726882230.38323: variable 'ansible_search_path' from source: unknown 11728 1726882230.38326: variable 'ansible_search_path' from source: unknown 11728 1726882230.38329: calling self._execute() 11728 1726882230.38363: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882230.38375: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882230.38386: variable 'omit' from source: magic vars 11728 1726882230.38762: variable 'ansible_distribution_major_version' from source: facts 11728 1726882230.38774: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882230.38780: variable 'omit' from source: magic vars 11728 1726882230.38827: variable 'omit' from source: magic vars 11728 1726882230.39099: variable 'lsr_description' from source: include params 11728 1726882230.39103: variable 'omit' from source: magic vars 11728 1726882230.39105: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882230.39108: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882230.39111: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882230.39114: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882230.39116: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882230.39118: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882230.39120: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882230.39122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882230.39215: Set connection var ansible_connection to ssh 11728 1726882230.39225: Set connection var ansible_shell_executable to /bin/sh 11728 1726882230.39231: Set connection var ansible_timeout to 10 11728 1726882230.39234: Set connection var ansible_shell_type to sh 11728 1726882230.39247: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882230.39253: Set connection var ansible_pipelining to False 11728 1726882230.39278: variable 'ansible_shell_executable' from source: unknown 11728 1726882230.39281: variable 'ansible_connection' from source: unknown 11728 1726882230.39284: variable 'ansible_module_compression' from source: unknown 11728 1726882230.39288: variable 'ansible_shell_type' from source: unknown 11728 1726882230.39291: variable 'ansible_shell_executable' from source: unknown 11728 1726882230.39294: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882230.39297: variable 'ansible_pipelining' from source: unknown 11728 1726882230.39302: variable 'ansible_timeout' from source: unknown 11728 1726882230.39307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882230.39452: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882230.39472: variable 'omit' from source: magic vars 11728 1726882230.39476: 
starting attempt loop 11728 1726882230.39480: running the handler 11728 1726882230.39529: handler run complete 11728 1726882230.39542: attempt loop complete, returning result 11728 1726882230.39545: _execute() done 11728 1726882230.39547: dumping result to json 11728 1726882230.39550: done dumping result, returning 11728 1726882230.39558: done running TaskExecutor() for managed_node3/TASK: Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' [12673a56-9f93-5c28-a762-0000000008f1] 11728 1726882230.39564: sending task result for task 12673a56-9f93-5c28-a762-0000000008f1 11728 1726882230.39653: done sending task result for task 12673a56-9f93-5c28-a762-0000000008f1 11728 1726882230.39656: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: +++++ Success in test 'Given two DHCP-enabled network interfaces, when creating a bond profile with them, then the controller device and bond port profiles are present and the specified bond options are set for the controller device.' +++++ 11728 1726882230.39729: no more pending results, returning what we have 11728 1726882230.39734: results queue empty 11728 1726882230.39735: checking for any_errors_fatal 11728 1726882230.39742: done checking for any_errors_fatal 11728 1726882230.39743: checking for max_fail_percentage 11728 1726882230.39745: done checking for max_fail_percentage 11728 1726882230.39746: checking to see if all hosts have failed and the running result is not ok 11728 1726882230.39746: done checking to see if all hosts have failed 11728 1726882230.39747: getting the remaining hosts for this loop 11728 1726882230.39749: done getting the remaining hosts for this loop 11728 1726882230.39752: getting the next task for host managed_node3 11728 1726882230.39758: done getting next task for host managed_node3 11728 1726882230.39761: ^ task is: TASK: Cleanup 11728 1726882230.39765: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882230.39770: getting variables 11728 1726882230.39771: in VariableManager get_vars() 11728 1726882230.39815: Calling all_inventory to load vars for managed_node3 11728 1726882230.39818: Calling groups_inventory to load vars for managed_node3 11728 1726882230.39820: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882230.39830: Calling all_plugins_play to load vars for managed_node3 11728 1726882230.39832: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882230.39835: Calling groups_plugins_play to load vars for managed_node3 11728 1726882230.41614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882230.43492: done with get_vars() 11728 1726882230.43517: done getting variables TASK [Cleanup] ***************************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/run_test.yml:66 Friday 20 September 2024 21:30:30 -0400 (0:00:00.061) 0:00:55.288 ****** 11728 1726882230.43615: entering _queue_task() for managed_node3/include_tasks 11728 1726882230.44114: worker is 1 (out of 1 available) 11728 1726882230.44124: exiting _queue_task() for managed_node3/include_tasks 11728 1726882230.44134: done queuing things up, now waiting for results queue to drain 11728 1726882230.44135: waiting for pending results... 11728 1726882230.44313: running TaskExecutor() for managed_node3/TASK: Cleanup 11728 1726882230.44353: in run() - task 12673a56-9f93-5c28-a762-0000000008f5 11728 1726882230.44401: variable 'ansible_search_path' from source: unknown 11728 1726882230.44404: variable 'ansible_search_path' from source: unknown 11728 1726882230.44423: variable 'lsr_cleanup' from source: include params 11728 1726882230.44634: variable 'lsr_cleanup' from source: include params 11728 1726882230.44899: variable 'omit' from source: magic vars 11728 1726882230.44903: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882230.44905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882230.44908: variable 'omit' from source: magic vars 11728 1726882230.45135: variable 'ansible_distribution_major_version' from source: facts 11728 1726882230.45143: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882230.45150: variable 'item' from source: unknown 11728 1726882230.45221: variable 'item' from source: unknown 11728 1726882230.45255: variable 'item' from source: unknown 11728 1726882230.45320: variable 'item' from source: unknown 11728 1726882230.45464: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882230.45468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882230.45470: variable 'omit' from source: magic vars 11728 1726882230.45584: variable 'ansible_distribution_major_version' from source: facts 11728 1726882230.45587: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882230.45701: variable 'item' from source: unknown 11728 1726882230.45705: variable 'item' from source: unknown 11728 1726882230.45707: variable 'item' from source: unknown 11728 1726882230.45734: variable 'item' from source: unknown 11728 1726882230.45809: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882230.45815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 
1726882230.45826: variable 'omit' from source: magic vars 11728 1726882230.45986: variable 'ansible_distribution_major_version' from source: facts 11728 1726882230.46002: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882230.46007: variable 'item' from source: unknown 11728 1726882230.46074: variable 'item' from source: unknown 11728 1726882230.46299: variable 'item' from source: unknown 11728 1726882230.46303: variable 'item' from source: unknown 11728 1726882230.46348: dumping result to json 11728 1726882230.46351: done dumping result, returning 11728 1726882230.46354: done running TaskExecutor() for managed_node3/TASK: Cleanup [12673a56-9f93-5c28-a762-0000000008f5] 11728 1726882230.46357: sending task result for task 12673a56-9f93-5c28-a762-0000000008f5 11728 1726882230.46391: done sending task result for task 12673a56-9f93-5c28-a762-0000000008f5 11728 1726882230.46396: WORKER PROCESS EXITING 11728 1726882230.46431: no more pending results, returning what we have 11728 1726882230.46437: in VariableManager get_vars() 11728 1726882230.46571: Calling all_inventory to load vars for managed_node3 11728 1726882230.46575: Calling groups_inventory to load vars for managed_node3 11728 1726882230.46577: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882230.46591: Calling all_plugins_play to load vars for managed_node3 11728 1726882230.46598: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882230.46602: Calling groups_plugins_play to load vars for managed_node3 11728 1726882230.48297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882230.49330: done with get_vars() 11728 1726882230.49344: variable 'ansible_search_path' from source: unknown 11728 1726882230.49345: variable 'ansible_search_path' from source: unknown 11728 1726882230.49374: variable 'ansible_search_path' from source: unknown 11728 1726882230.49375: variable 'ansible_search_path' from source: unknown 11728 1726882230.49394: variable 'ansible_search_path' from source: unknown 11728 1726882230.49396: variable 'ansible_search_path' from source: unknown 11728 1726882230.49413: we have included files to process 11728 1726882230.49414: generating all_blocks data 11728 1726882230.49415: done generating all_blocks data 11728 1726882230.49419: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 11728 1726882230.49419: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 11728 1726882230.49421: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml 11728 1726882230.49521: in VariableManager get_vars() 11728 1726882230.49538: done with get_vars() 11728 1726882230.49541: variable 'omit' from source: magic vars 11728 1726882230.49565: variable 'omit' from source: magic vars 11728 1726882230.49602: in VariableManager get_vars() 11728 1726882230.49614: done with get_vars() 11728 1726882230.49633: in VariableManager get_vars() 11728 1726882230.49646: done with get_vars() 11728 1726882230.49671: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 11728 1726882230.49741: Loading data from 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 11728 1726882230.49824: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 11728 1726882230.50046: in VariableManager get_vars() 11728 1726882230.50062: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11728 1726882230.51814: done processing included file 11728 1726882230.51816: iterating over new_blocks loaded from include file 11728 1726882230.51817: in VariableManager get_vars() 11728 1726882230.51832: done with get_vars() 11728 1726882230.51833: filtering new block on tags 11728 1726882230.52037: done filtering new block on tags 11728 1726882230.52040: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml for managed_node3 => (item=tasks/cleanup_bond_profile+device.yml) 11728 1726882230.52045: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 11728 1726882230.52045: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 11728 1726882230.52047: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 11728 1726882230.52343: done processing included file 11728 1726882230.52345: iterating over new_blocks loaded from include file 11728 1726882230.52346: in VariableManager get_vars() 11728 1726882230.52366: done with get_vars() 11728 1726882230.52368: filtering new block on tags 11728 1726882230.52482: done filtering new block on tags 11728 1726882230.52484: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml for managed_node3 => (item=tasks/remove_test_interfaces_with_dhcp.yml) 11728 1726882230.52488: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 11728 1726882230.52497: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 11728 1726882230.52500: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 11728 1726882230.52860: done processing included file 11728 1726882230.52862: iterating over new_blocks loaded from include file 11728 1726882230.52864: in VariableManager get_vars() 11728 1726882230.52883: done with get_vars() 11728 1726882230.52885: filtering new block on tags 11728 1726882230.52917: done filtering new block on tags 11728 1726882230.52919: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node3 => (item=tasks/check_network_dns.yml) 11728 1726882230.52922: extending task lists for all hosts with included blocks 11728 1726882230.63826: done extending task lists 11728 1726882230.63828: done processing included files 11728 1726882230.63829: results queue empty 11728 1726882230.63829: 
checking for any_errors_fatal 11728 1726882230.63833: done checking for any_errors_fatal 11728 1726882230.63833: checking for max_fail_percentage 11728 1726882230.63834: done checking for max_fail_percentage 11728 1726882230.63835: checking to see if all hosts have failed and the running result is not ok 11728 1726882230.63836: done checking to see if all hosts have failed 11728 1726882230.63837: getting the remaining hosts for this loop 11728 1726882230.63838: done getting the remaining hosts for this loop 11728 1726882230.63840: getting the next task for host managed_node3 11728 1726882230.63844: done getting next task for host managed_node3 11728 1726882230.63847: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11728 1726882230.63850: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882230.63860: getting variables 11728 1726882230.63861: in VariableManager get_vars() 11728 1726882230.63882: Calling all_inventory to load vars for managed_node3 11728 1726882230.63884: Calling groups_inventory to load vars for managed_node3 11728 1726882230.63886: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882230.63891: Calling all_plugins_play to load vars for managed_node3 11728 1726882230.63897: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882230.63899: Calling groups_plugins_play to load vars for managed_node3 11728 1726882230.65144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882230.66729: done with get_vars() 11728 1726882230.66758: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:30:30 -0400 (0:00:00.232) 0:00:55.520 ****** 11728 1726882230.66851: entering _queue_task() for managed_node3/include_tasks 11728 1726882230.67230: worker is 1 (out of 1 available) 11728 1726882230.67242: exiting _queue_task() for managed_node3/include_tasks 11728 1726882230.67254: done queuing things up, now waiting for results queue to drain 11728 1726882230.67255: waiting for pending results... 
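
[Editorial note] The Cleanup task processed just above (run_test.yml:66) is an include_tasks driven by the lsr_cleanup include parameter: the records show the loop variable item being resolved for three values and the files tasks/cleanup_bond_profile+device.yml, tasks/remove_test_interfaces_with_dhcp.yml and tasks/check_network_dns.yml being included for managed_node3. A plausible sketch of that step, assuming a plain loop over lsr_cleanup (the real task may carry additional tags or conditions):

    # Hypothetical reconstruction of the Cleanup step at run_test.yml:66.
    # lsr_cleanup is an include parameter; in this run it expanded to the
    # three cleanup task files named in the log records above.
    - name: Cleanup
      ansible.builtin.include_tasks: "{{ item }}"
      loop: "{{ lsr_cleanup }}"
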
11728 1726882230.67613: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11728 1726882230.67710: in run() - task 12673a56-9f93-5c28-a762-000000000e0a 11728 1726882230.67727: variable 'ansible_search_path' from source: unknown 11728 1726882230.67731: variable 'ansible_search_path' from source: unknown 11728 1726882230.67772: calling self._execute() 11728 1726882230.67878: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882230.67882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882230.68098: variable 'omit' from source: magic vars 11728 1726882230.68302: variable 'ansible_distribution_major_version' from source: facts 11728 1726882230.68314: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882230.68320: _execute() done 11728 1726882230.68324: dumping result to json 11728 1726882230.68327: done dumping result, returning 11728 1726882230.68333: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-5c28-a762-000000000e0a] 11728 1726882230.68339: sending task result for task 12673a56-9f93-5c28-a762-000000000e0a 11728 1726882230.68435: done sending task result for task 12673a56-9f93-5c28-a762-000000000e0a 11728 1726882230.68439: WORKER PROCESS EXITING 11728 1726882230.68496: no more pending results, returning what we have 11728 1726882230.68502: in VariableManager get_vars() 11728 1726882230.68557: Calling all_inventory to load vars for managed_node3 11728 1726882230.68560: Calling groups_inventory to load vars for managed_node3 11728 1726882230.68563: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882230.68575: Calling all_plugins_play to load vars for managed_node3 11728 1726882230.68578: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882230.68581: Calling groups_plugins_play to load vars for managed_node3 11728 1726882230.70291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882230.71912: done with get_vars() 11728 1726882230.71933: variable 'ansible_search_path' from source: unknown 11728 1726882230.71934: variable 'ansible_search_path' from source: unknown 11728 1726882230.71981: we have included files to process 11728 1726882230.71982: generating all_blocks data 11728 1726882230.71984: done generating all_blocks data 11728 1726882230.71986: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11728 1726882230.71987: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11728 1726882230.71989: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11728 1726882230.72581: done processing included file 11728 1726882230.72583: iterating over new_blocks loaded from include file 11728 1726882230.72584: in VariableManager get_vars() 11728 1726882230.72621: done with get_vars() 11728 1726882230.72624: filtering new block on tags 11728 1726882230.72658: done filtering new block on tags 11728 1726882230.72661: in VariableManager get_vars() 11728 1726882230.72690: done with get_vars() 11728 1726882230.72691: filtering new block on tags 11728 1726882230.72746: done filtering new block on tags 11728 1726882230.72749: in 
VariableManager get_vars() 11728 1726882230.72777: done with get_vars() 11728 1726882230.72778: filtering new block on tags 11728 1726882230.72831: done filtering new block on tags 11728 1726882230.72833: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 11728 1726882230.72839: extending task lists for all hosts with included blocks 11728 1726882230.74600: done extending task lists 11728 1726882230.74602: done processing included files 11728 1726882230.74602: results queue empty 11728 1726882230.74603: checking for any_errors_fatal 11728 1726882230.74607: done checking for any_errors_fatal 11728 1726882230.74608: checking for max_fail_percentage 11728 1726882230.74609: done checking for max_fail_percentage 11728 1726882230.74610: checking to see if all hosts have failed and the running result is not ok 11728 1726882230.74611: done checking to see if all hosts have failed 11728 1726882230.74611: getting the remaining hosts for this loop 11728 1726882230.74613: done getting the remaining hosts for this loop 11728 1726882230.74615: getting the next task for host managed_node3 11728 1726882230.74620: done getting next task for host managed_node3 11728 1726882230.74622: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11728 1726882230.74627: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882230.74638: getting variables 11728 1726882230.74640: in VariableManager get_vars() 11728 1726882230.74662: Calling all_inventory to load vars for managed_node3 11728 1726882230.74665: Calling groups_inventory to load vars for managed_node3 11728 1726882230.74667: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882230.74672: Calling all_plugins_play to load vars for managed_node3 11728 1726882230.74675: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882230.74677: Calling groups_plugins_play to load vars for managed_node3 11728 1726882230.75817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882230.77425: done with get_vars() 11728 1726882230.77447: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:30:30 -0400 (0:00:00.106) 0:00:55.627 ****** 11728 1726882230.77537: entering _queue_task() for managed_node3/setup 11728 1726882230.77888: worker is 1 (out of 1 available) 11728 1726882230.77902: exiting _queue_task() for managed_node3/setup 11728 1726882230.77914: done queuing things up, now waiting for results queue to drain 11728 1726882230.77915: waiting for pending results... 11728 1726882230.78227: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11728 1726882230.78409: in run() - task 12673a56-9f93-5c28-a762-000000000fde 11728 1726882230.78426: variable 'ansible_search_path' from source: unknown 11728 1726882230.78430: variable 'ansible_search_path' from source: unknown 11728 1726882230.78462: calling self._execute() 11728 1726882230.78558: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882230.78562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882230.78571: variable 'omit' from source: magic vars 11728 1726882230.78963: variable 'ansible_distribution_major_version' from source: facts 11728 1726882230.78974: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882230.79192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882230.81558: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882230.81584: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882230.81635: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882230.81668: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882230.81698: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882230.81783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882230.81817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 11728 1726882230.81842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882230.81976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882230.81979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882230.81981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882230.81990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882230.82018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882230.82057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882230.82079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882230.82241: variable '__network_required_facts' from source: role '' defaults 11728 1726882230.82249: variable 'ansible_facts' from source: unknown 11728 1726882230.82996: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 11728 1726882230.83000: when evaluation is False, skipping this task 11728 1726882230.83003: _execute() done 11728 1726882230.83005: dumping result to json 11728 1726882230.83008: done dumping result, returning 11728 1726882230.83018: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-5c28-a762-000000000fde] 11728 1726882230.83023: sending task result for task 12673a56-9f93-5c28-a762-000000000fde 11728 1726882230.83124: done sending task result for task 12673a56-9f93-5c28-a762-000000000fde 11728 1726882230.83128: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11728 1726882230.83190: no more pending results, returning what we have 11728 1726882230.83198: results queue empty 11728 1726882230.83199: checking for any_errors_fatal 11728 1726882230.83200: done checking for any_errors_fatal 11728 1726882230.83201: checking for max_fail_percentage 11728 1726882230.83203: done checking for max_fail_percentage 11728 1726882230.83204: checking to see if all hosts have failed and the running result is not ok 11728 1726882230.83205: done checking to see if all hosts have failed 11728 1726882230.83205: getting the remaining hosts for this loop 11728 1726882230.83207: done getting the remaining hosts for 
this loop 11728 1726882230.83211: getting the next task for host managed_node3 11728 1726882230.83223: done getting next task for host managed_node3 11728 1726882230.83226: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 11728 1726882230.83233: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882230.83256: getting variables 11728 1726882230.83257: in VariableManager get_vars() 11728 1726882230.83307: Calling all_inventory to load vars for managed_node3 11728 1726882230.83310: Calling groups_inventory to load vars for managed_node3 11728 1726882230.83313: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882230.83323: Calling all_plugins_play to load vars for managed_node3 11728 1726882230.83326: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882230.83334: Calling groups_plugins_play to load vars for managed_node3 11728 1726882230.85005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882230.86622: done with get_vars() 11728 1726882230.86649: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:30:30 -0400 (0:00:00.092) 0:00:55.720 ****** 11728 1726882230.86759: entering _queue_task() for managed_node3/stat 11728 1726882230.87319: worker is 1 (out of 1 available) 11728 1726882230.87329: exiting _queue_task() for managed_node3/stat 11728 1726882230.87338: done queuing things up, now waiting for results queue to drain 11728 1726882230.87340: waiting for pending results... 
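
[Editorial note] The skipped task above ("Ensure ansible_facts used by role are present", set_facts.yml:3, reached via the include at roles/network/tasks/main.yml:4) guards fact gathering with the conditional quoted in the log: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0. In other words, setup only runs when some fact the role needs is missing from the cached facts; here everything was already present, so the condition evaluated to False and the task was skipped. A minimal sketch of that pattern, assuming __network_required_facts is a list defined in the role defaults and that a minimal gather_subset is used (the subset is an assumption; the when expression is taken from the log):

    # Hypothetical sketch of the guarded fact gathering at set_facts.yml:3.
    # __network_required_facts is assumed to be a list of fact names from the
    # role defaults; gather_subset: min is an illustrative choice.
    - name: Ensure ansible_facts used by role are present
      ansible.builtin.setup:
        gather_subset: min
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
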
11728 1726882230.87511: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 11728 1726882230.87587: in run() - task 12673a56-9f93-5c28-a762-000000000fe0 11728 1726882230.87605: variable 'ansible_search_path' from source: unknown 11728 1726882230.87609: variable 'ansible_search_path' from source: unknown 11728 1726882230.87643: calling self._execute() 11728 1726882230.87743: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882230.87746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882230.87758: variable 'omit' from source: magic vars 11728 1726882230.88156: variable 'ansible_distribution_major_version' from source: facts 11728 1726882230.88168: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882230.88346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882230.88632: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882230.88681: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882230.88719: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882230.88752: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882230.88882: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882230.88912: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882230.88937: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882230.88962: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882230.89061: variable '__network_is_ostree' from source: set_fact 11728 1726882230.89067: Evaluated conditional (not __network_is_ostree is defined): False 11728 1726882230.89070: when evaluation is False, skipping this task 11728 1726882230.89073: _execute() done 11728 1726882230.89076: dumping result to json 11728 1726882230.89097: done dumping result, returning 11728 1726882230.89101: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-5c28-a762-000000000fe0] 11728 1726882230.89104: sending task result for task 12673a56-9f93-5c28-a762-000000000fe0 skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11728 1726882230.89255: no more pending results, returning what we have 11728 1726882230.89259: results queue empty 11728 1726882230.89261: checking for any_errors_fatal 11728 1726882230.89271: done checking for any_errors_fatal 11728 1726882230.89272: checking for max_fail_percentage 11728 1726882230.89274: done checking for max_fail_percentage 11728 1726882230.89275: checking to see if all hosts have 
failed and the running result is not ok 11728 1726882230.89275: done checking to see if all hosts have failed 11728 1726882230.89276: getting the remaining hosts for this loop 11728 1726882230.89278: done getting the remaining hosts for this loop 11728 1726882230.89282: getting the next task for host managed_node3 11728 1726882230.89291: done getting next task for host managed_node3 11728 1726882230.89298: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11728 1726882230.89307: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882230.89332: getting variables 11728 1726882230.89334: in VariableManager get_vars() 11728 1726882230.89382: Calling all_inventory to load vars for managed_node3 11728 1726882230.89385: Calling groups_inventory to load vars for managed_node3 11728 1726882230.89388: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882230.89406: Calling all_plugins_play to load vars for managed_node3 11728 1726882230.89410: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882230.89414: Calling groups_plugins_play to load vars for managed_node3 11728 1726882230.89932: done sending task result for task 12673a56-9f93-5c28-a762-000000000fe0 11728 1726882230.89936: WORKER PROCESS EXITING 11728 1726882230.90987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882230.92759: done with get_vars() 11728 1726882230.92780: done getting variables 11728 1726882230.92845: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:30:30 -0400 (0:00:00.061) 0:00:55.781 ****** 11728 1726882230.92891: entering _queue_task() for managed_node3/set_fact 11728 1726882230.93311: worker is 1 (out of 1 available) 11728 1726882230.93322: exiting _queue_task() for managed_node3/set_fact 11728 1726882230.93334: done queuing things up, now waiting for results queue to drain 11728 1726882230.93336: waiting for pending results... 
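
[Editorial note] The "Check if system is ostree" task (set_facts.yml:12) was skipped because __network_is_ostree had already been set by an earlier pass through this role; the log quotes its condition as not __network_is_ostree is defined. A sketch of what such a probe might look like, assuming it stats the conventional /run/ostree-booted marker file (the path and the register name are not shown in this log and are assumptions; only the task name and the when clause come from the log):

    # Hypothetical sketch of the ostree probe at set_facts.yml:12.
    # /run/ostree-booted is an assumed convention for detecting an
    # rpm-ostree based system; __ostree_booted_stat is a made-up name.
    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted
      register: __ostree_booted_stat
      when: not __network_is_ostree is defined
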
11728 1726882230.93630: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11728 1726882230.93823: in run() - task 12673a56-9f93-5c28-a762-000000000fe1 11728 1726882230.93834: variable 'ansible_search_path' from source: unknown 11728 1726882230.93838: variable 'ansible_search_path' from source: unknown 11728 1726882230.93868: calling self._execute() 11728 1726882230.93945: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882230.93950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882230.93960: variable 'omit' from source: magic vars 11728 1726882230.94244: variable 'ansible_distribution_major_version' from source: facts 11728 1726882230.94252: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882230.94369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882230.94565: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882230.94604: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882230.94631: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882230.94656: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882230.94748: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882230.94765: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882230.94783: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882230.94807: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882230.94874: variable '__network_is_ostree' from source: set_fact 11728 1726882230.94879: Evaluated conditional (not __network_is_ostree is defined): False 11728 1726882230.94882: when evaluation is False, skipping this task 11728 1726882230.94885: _execute() done 11728 1726882230.94889: dumping result to json 11728 1726882230.94892: done dumping result, returning 11728 1726882230.94906: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-5c28-a762-000000000fe1] 11728 1726882230.94908: sending task result for task 12673a56-9f93-5c28-a762-000000000fe1 11728 1726882230.94989: done sending task result for task 12673a56-9f93-5c28-a762-000000000fe1 11728 1726882230.94992: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11728 1726882230.95059: no more pending results, returning what we have 11728 1726882230.95063: results queue empty 11728 1726882230.95064: checking for any_errors_fatal 11728 1726882230.95069: done checking for any_errors_fatal 11728 
1726882230.95070: checking for max_fail_percentage 11728 1726882230.95071: done checking for max_fail_percentage 11728 1726882230.95072: checking to see if all hosts have failed and the running result is not ok 11728 1726882230.95073: done checking to see if all hosts have failed 11728 1726882230.95074: getting the remaining hosts for this loop 11728 1726882230.95075: done getting the remaining hosts for this loop 11728 1726882230.95079: getting the next task for host managed_node3 11728 1726882230.95089: done getting next task for host managed_node3 11728 1726882230.95096: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 11728 1726882230.95103: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882230.95124: getting variables 11728 1726882230.95126: in VariableManager get_vars() 11728 1726882230.95164: Calling all_inventory to load vars for managed_node3 11728 1726882230.95167: Calling groups_inventory to load vars for managed_node3 11728 1726882230.95169: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882230.95177: Calling all_plugins_play to load vars for managed_node3 11728 1726882230.95179: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882230.95181: Calling groups_plugins_play to load vars for managed_node3 11728 1726882230.96409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882230.97347: done with get_vars() 11728 1726882230.97368: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:30:30 -0400 (0:00:00.045) 0:00:55.826 ****** 11728 1726882230.97448: entering _queue_task() for managed_node3/service_facts 11728 1726882230.97716: worker is 1 (out of 1 available) 11728 1726882230.97729: exiting _queue_task() for managed_node3/service_facts 11728 1726882230.97742: done queuing things up, now waiting for results queue to drain 11728 1726882230.97743: waiting for pending results... 
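
[Editorial note] The records above cover the remainder of set_facts.yml in this run: "Set flag to indicate system is ostree" (set_facts.yml:17) was skipped under the same not __network_is_ostree is defined condition, and "Check which services are running" (set_facts.yml:21) is queued next; the module transfer that follows confirms it runs ansible.builtin.service_facts. A hedged sketch of those two tasks, with the set_fact expression and the registered variable name invented for illustration (only the task names, the when clause and the service_facts module are confirmed by the log):

    # Hypothetical continuation of set_facts.yml.
    # __ostree_booted_stat and the stat-based expression are assumptions
    # carried over from the sketch of the probe task above.
    - name: Set flag to indicate system is ostree
      ansible.builtin.set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists | default(false) }}"
      when: not __network_is_ostree is defined

    # Populates ansible_facts.services on the managed host.
    - name: Check which services are running
      ansible.builtin.service_facts:
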
11728 1726882230.97933: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 11728 1726882230.98047: in run() - task 12673a56-9f93-5c28-a762-000000000fe3 11728 1726882230.98059: variable 'ansible_search_path' from source: unknown 11728 1726882230.98063: variable 'ansible_search_path' from source: unknown 11728 1726882230.98096: calling self._execute() 11728 1726882230.98166: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882230.98171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882230.98181: variable 'omit' from source: magic vars 11728 1726882230.98550: variable 'ansible_distribution_major_version' from source: facts 11728 1726882230.98554: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882230.98557: variable 'omit' from source: magic vars 11728 1726882230.98818: variable 'omit' from source: magic vars 11728 1726882230.98822: variable 'omit' from source: magic vars 11728 1726882230.98825: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882230.98827: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882230.98830: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882230.98832: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882230.98834: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882230.98836: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882230.98838: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882230.98840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882230.98912: Set connection var ansible_connection to ssh 11728 1726882230.98923: Set connection var ansible_shell_executable to /bin/sh 11728 1726882230.98928: Set connection var ansible_timeout to 10 11728 1726882230.98931: Set connection var ansible_shell_type to sh 11728 1726882230.98939: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882230.98944: Set connection var ansible_pipelining to False 11728 1726882230.98967: variable 'ansible_shell_executable' from source: unknown 11728 1726882230.98970: variable 'ansible_connection' from source: unknown 11728 1726882230.98973: variable 'ansible_module_compression' from source: unknown 11728 1726882230.98976: variable 'ansible_shell_type' from source: unknown 11728 1726882230.98978: variable 'ansible_shell_executable' from source: unknown 11728 1726882230.98980: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882230.98982: variable 'ansible_pipelining' from source: unknown 11728 1726882230.98984: variable 'ansible_timeout' from source: unknown 11728 1726882230.99069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882230.99183: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11728 1726882230.99197: variable 'omit' from source: magic vars 11728 
1726882230.99204: starting attempt loop 11728 1726882230.99207: running the handler 11728 1726882230.99222: _low_level_execute_command(): starting 11728 1726882230.99231: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882230.99940: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882230.99952: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882230.99964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882230.99979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882230.99992: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882231.00006: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882231.00017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882231.00032: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882231.00048: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882231.00051: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11728 1726882231.00056: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882231.00067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882231.00079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882231.00101: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882231.00104: stderr chunk (state=3): >>>debug2: match found <<< 11728 1726882231.00110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882231.00237: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882231.00240: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882231.00243: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882231.00326: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882231.01987: stdout chunk (state=3): >>>/root <<< 11728 1726882231.02148: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882231.02151: stdout chunk (state=3): >>><<< 11728 1726882231.02153: stderr chunk (state=3): >>><<< 11728 1726882231.02276: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882231.02280: _low_level_execute_command(): starting 11728 1726882231.02284: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882231.0217931-14488-143397476323384 `" && echo ansible-tmp-1726882231.0217931-14488-143397476323384="` echo /root/.ansible/tmp/ansible-tmp-1726882231.0217931-14488-143397476323384 `" ) && sleep 0' 11728 1726882231.02913: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882231.02990: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882231.03014: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882231.03036: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882231.03115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882231.05008: stdout chunk (state=3): >>>ansible-tmp-1726882231.0217931-14488-143397476323384=/root/.ansible/tmp/ansible-tmp-1726882231.0217931-14488-143397476323384 <<< 11728 1726882231.05166: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882231.05169: stdout chunk (state=3): >>><<< 11728 1726882231.05171: stderr chunk (state=3): >>><<< 11728 1726882231.05401: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882231.0217931-14488-143397476323384=/root/.ansible/tmp/ansible-tmp-1726882231.0217931-14488-143397476323384 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882231.05405: variable 'ansible_module_compression' from source: unknown 11728 1726882231.05407: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 11728 1726882231.05409: variable 'ansible_facts' from source: unknown 11728 1726882231.05431: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882231.0217931-14488-143397476323384/AnsiballZ_service_facts.py 11728 1726882231.05660: Sending initial data 11728 1726882231.05663: Sent initial data (162 bytes) 11728 1726882231.06183: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882231.06192: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882231.06296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882231.06341: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882231.06380: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882231.07930: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882231.07977: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882231.08040: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpl5y1m__p /root/.ansible/tmp/ansible-tmp-1726882231.0217931-14488-143397476323384/AnsiballZ_service_facts.py <<< 11728 1726882231.08049: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882231.0217931-14488-143397476323384/AnsiballZ_service_facts.py" <<< 11728 1726882231.08090: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpl5y1m__p" to remote "/root/.ansible/tmp/ansible-tmp-1726882231.0217931-14488-143397476323384/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882231.0217931-14488-143397476323384/AnsiballZ_service_facts.py" <<< 11728 1726882231.08935: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882231.08948: stderr chunk (state=3): >>><<< 11728 1726882231.09067: stdout chunk (state=3): >>><<< 11728 1726882231.09071: done transferring module to remote 11728 1726882231.09074: _low_level_execute_command(): starting 11728 1726882231.09076: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882231.0217931-14488-143397476323384/ /root/.ansible/tmp/ansible-tmp-1726882231.0217931-14488-143397476323384/AnsiballZ_service_facts.py && sleep 0' 11728 1726882231.09725: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882231.09744: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882231.09760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882231.09844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882231.09891: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882231.09914: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882231.09940: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882231.10024: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882231.11829: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882231.11833: stdout chunk (state=3): >>><<< 11728 1726882231.11835: stderr chunk (state=3): >>><<< 11728 1726882231.11901: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882231.11904: _low_level_execute_command(): starting 11728 1726882231.11907: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882231.0217931-14488-143397476323384/AnsiballZ_service_facts.py && sleep 0' 11728 1726882231.12507: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882231.12522: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882231.12543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882231.12566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882231.12582: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882231.12599: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882231.12667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882231.12716: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882231.12734: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882231.12785: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882231.12840: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882232.64544: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": 
"enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": 
"rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": 
"systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": 
"inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integratio<<< 11728 1726882232.64560: stdout chunk (state=3): >>>n.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": 
"systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 11728 1726882232.66100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882232.66104: stdout chunk (state=3): >>><<< 11728 1726882232.66107: stderr chunk (state=3): >>><<< 11728 1726882232.66111: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": 
"NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": 
"sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": 
"sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": 
"systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 11728 1726882232.67289: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882231.0217931-14488-143397476323384/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882232.67299: _low_level_execute_command(): starting 11728 1726882232.67306: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882231.0217931-14488-143397476323384/ > /dev/null 2>&1 && sleep 0' 11728 1726882232.67901: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882232.67912: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882232.67923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882232.67936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882232.68008: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882232.68011: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882232.68014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882232.68016: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882232.68018: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882232.68021: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11728 1726882232.68023: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882232.68025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882232.68027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882232.68030: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882232.68035: stderr chunk (state=3): >>>debug2: match found <<< 11728 1726882232.68044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882232.68114: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882232.68128: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882232.68153: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882232.68206: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882232.70008: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882232.70102: stderr chunk (state=3): >>><<< 11728 1726882232.70105: 
stdout chunk (state=3): >>><<< 11728 1726882232.70107: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882232.70109: handler run complete 11728 1726882232.70287: variable 'ansible_facts' from source: unknown 11728 1726882232.70449: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882232.70983: variable 'ansible_facts' from source: unknown 11728 1726882232.71125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882232.71400: attempt loop complete, returning result 11728 1726882232.71404: _execute() done 11728 1726882232.71406: dumping result to json 11728 1726882232.71441: done dumping result, returning 11728 1726882232.71454: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-5c28-a762-000000000fe3] 11728 1726882232.71463: sending task result for task 12673a56-9f93-5c28-a762-000000000fe3 11728 1726882232.72847: done sending task result for task 12673a56-9f93-5c28-a762-000000000fe3 11728 1726882232.72850: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11728 1726882232.72977: no more pending results, returning what we have 11728 1726882232.72980: results queue empty 11728 1726882232.72981: checking for any_errors_fatal 11728 1726882232.72984: done checking for any_errors_fatal 11728 1726882232.72985: checking for max_fail_percentage 11728 1726882232.72986: done checking for max_fail_percentage 11728 1726882232.72987: checking to see if all hosts have failed and the running result is not ok 11728 1726882232.72987: done checking to see if all hosts have failed 11728 1726882232.72988: getting the remaining hosts for this loop 11728 1726882232.72989: done getting the remaining hosts for this loop 11728 1726882232.72997: getting the next task for host managed_node3 11728 1726882232.73003: done getting next task for host managed_node3 11728 1726882232.73006: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 11728 1726882232.73013: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882232.73028: getting variables 11728 1726882232.73029: in VariableManager get_vars() 11728 1726882232.73067: Calling all_inventory to load vars for managed_node3 11728 1726882232.73069: Calling groups_inventory to load vars for managed_node3 11728 1726882232.73072: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882232.73080: Calling all_plugins_play to load vars for managed_node3 11728 1726882232.73082: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882232.73085: Calling groups_plugins_play to load vars for managed_node3 11728 1726882232.74316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882232.76003: done with get_vars() 11728 1726882232.76025: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:30:32 -0400 (0:00:01.786) 0:00:57.613 ****** 11728 1726882232.76127: entering _queue_task() for managed_node3/package_facts 11728 1726882232.76624: worker is 1 (out of 1 available) 11728 1726882232.76636: exiting _queue_task() for managed_node3/package_facts 11728 1726882232.76648: done queuing things up, now waiting for results queue to drain 11728 1726882232.76649: waiting for pending results... 
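[Editor's note] The trace above covers two fact-gathering steps of the fedora.linux_system_roles.network role: the service_facts task whose result was censored ("no_log: true"), and the package_facts task just queued from roles/network/tasks/set_facts.yml:26. As a hedged sketch only, such tasks are typically written as plain module invocations like the following; the task names, the no_log setting on the second task, and any conditionals are assumptions, not taken from this log:

    # Sketch of the kind of tasks being traced here (assumed wording, not the
    # actual contents of set_facts.yml in fedora.linux_system_roles.network).
    - name: Check which services are running
      ansible.builtin.service_facts:
      no_log: true        # the log shows this result was censored by no_log

    - name: Check which packages are installed
      ansible.builtin.package_facts:
      # no_log here is assumed; the result is not shown in this excerpt

service_facts populates ansible_facts.services and package_facts populates ansible_facts.packages, which is the data the subsequent role logic consumes; the large JSON blobs in the log above are exactly those dictionaries as returned by the modules.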
11728 1726882232.76890: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 11728 1726882232.76989: in run() - task 12673a56-9f93-5c28-a762-000000000fe4 11728 1726882232.77017: variable 'ansible_search_path' from source: unknown 11728 1726882232.77026: variable 'ansible_search_path' from source: unknown 11728 1726882232.77066: calling self._execute() 11728 1726882232.77170: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882232.77182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882232.77207: variable 'omit' from source: magic vars 11728 1726882232.77602: variable 'ansible_distribution_major_version' from source: facts 11728 1726882232.77621: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882232.77639: variable 'omit' from source: magic vars 11728 1726882232.77735: variable 'omit' from source: magic vars 11728 1726882232.77779: variable 'omit' from source: magic vars 11728 1726882232.77826: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882232.77900: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882232.77904: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882232.77924: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882232.77942: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882232.77983: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882232.78074: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882232.78078: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882232.78113: Set connection var ansible_connection to ssh 11728 1726882232.78131: Set connection var ansible_shell_executable to /bin/sh 11728 1726882232.78145: Set connection var ansible_timeout to 10 11728 1726882232.78153: Set connection var ansible_shell_type to sh 11728 1726882232.78166: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882232.78184: Set connection var ansible_pipelining to False 11728 1726882232.78222: variable 'ansible_shell_executable' from source: unknown 11728 1726882232.78232: variable 'ansible_connection' from source: unknown 11728 1726882232.78241: variable 'ansible_module_compression' from source: unknown 11728 1726882232.78250: variable 'ansible_shell_type' from source: unknown 11728 1726882232.78260: variable 'ansible_shell_executable' from source: unknown 11728 1726882232.78268: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882232.78276: variable 'ansible_pipelining' from source: unknown 11728 1726882232.78285: variable 'ansible_timeout' from source: unknown 11728 1726882232.78303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882232.78512: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11728 1726882232.78522: variable 'omit' from source: magic vars 11728 
1726882232.78600: starting attempt loop 11728 1726882232.78604: running the handler 11728 1726882232.78606: _low_level_execute_command(): starting 11728 1726882232.78609: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882232.79365: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882232.79389: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882232.79503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882232.79526: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882232.79613: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882232.81222: stdout chunk (state=3): >>>/root <<< 11728 1726882232.81398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882232.81402: stdout chunk (state=3): >>><<< 11728 1726882232.81405: stderr chunk (state=3): >>><<< 11728 1726882232.81539: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882232.81544: _low_level_execute_command(): starting 11728 1726882232.81547: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882232.81433-14545-83570880521917 `" && echo ansible-tmp-1726882232.81433-14545-83570880521917="` echo 
/root/.ansible/tmp/ansible-tmp-1726882232.81433-14545-83570880521917 `" ) && sleep 0' 11728 1726882232.82132: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882232.82145: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882232.82213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882232.82281: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882232.82302: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882232.82323: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882232.82403: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882232.84237: stdout chunk (state=3): >>>ansible-tmp-1726882232.81433-14545-83570880521917=/root/.ansible/tmp/ansible-tmp-1726882232.81433-14545-83570880521917 <<< 11728 1726882232.84399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882232.84402: stdout chunk (state=3): >>><<< 11728 1726882232.84405: stderr chunk (state=3): >>><<< 11728 1726882232.84423: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882232.81433-14545-83570880521917=/root/.ansible/tmp/ansible-tmp-1726882232.81433-14545-83570880521917 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882232.84475: variable 'ansible_module_compression' from source: unknown 11728 1726882232.84581: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 11728 1726882232.84872: variable 'ansible_facts' from source: unknown 11728 1726882232.85064: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882232.81433-14545-83570880521917/AnsiballZ_package_facts.py 11728 1726882232.85534: Sending initial data 11728 1726882232.85538: Sent initial data (159 bytes) 11728 1726882232.86586: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882232.86630: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882232.86660: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882232.86740: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882232.88273: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882232.88349: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882232.88429: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpk5mvt8bx /root/.ansible/tmp/ansible-tmp-1726882232.81433-14545-83570880521917/AnsiballZ_package_facts.py <<< 11728 1726882232.88433: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882232.81433-14545-83570880521917/AnsiballZ_package_facts.py" <<< 11728 1726882232.88488: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpk5mvt8bx" to remote "/root/.ansible/tmp/ansible-tmp-1726882232.81433-14545-83570880521917/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882232.81433-14545-83570880521917/AnsiballZ_package_facts.py" <<< 11728 1726882232.89788: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882232.89909: stderr chunk (state=3): >>><<< 11728 1726882232.89913: stdout chunk (state=3): >>><<< 11728 1726882232.89916: done transferring module to remote 11728 1726882232.89918: _low_level_execute_command(): starting 11728 1726882232.89920: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882232.81433-14545-83570880521917/ /root/.ansible/tmp/ansible-tmp-1726882232.81433-14545-83570880521917/AnsiballZ_package_facts.py && sleep 0' 11728 1726882232.90509: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882232.90612: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882232.90651: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882232.90664: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882232.90684: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882232.90766: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882232.92569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882232.92581: stdout chunk (state=3): >>><<< 11728 1726882232.92597: stderr chunk (state=3): >>><<< 11728 1726882232.92772: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882232.92775: _low_level_execute_command(): starting 11728 1726882232.92778: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882232.81433-14545-83570880521917/AnsiballZ_package_facts.py && sleep 0' 11728 1726882232.93354: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882232.93370: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882232.93385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882232.93413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882232.93507: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882232.93534: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882232.93628: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882233.38468: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", 
"release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 11728 1726882233.38487: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": 
"0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 11728 1726882233.38501: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", 
"epoch": null, "arc<<< 11728 1726882233.38586: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", 
"version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], 
"prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": 
"0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": 
[{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", 
"release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": 
[{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 11728 1726882233.40603: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882233.40606: stdout chunk (state=3): >>><<< 11728 1726882233.40609: stderr chunk (state=3): >>><<< 11728 1726882233.40707: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": 
"default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": 
"1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": 
"libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": 
"ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": 
[{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", 
"release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": 
"12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": 
[{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": 
"x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", 
"version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", 
"version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
11728 1726882233.45144: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882232.81433-14545-83570880521917/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882233.45400: _low_level_execute_command(): starting 11728 1726882233.45481: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882232.81433-14545-83570880521917/ > /dev/null 2>&1 && sleep 0' 11728 1726882233.46474: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882233.46584: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882233.46614: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882233.46710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882233.48556: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882233.48801: stderr chunk (state=3): >>><<< 11728 1726882233.48804: stdout chunk (state=3): >>><<< 11728 1726882233.48807: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882233.48809: handler run complete 11728 1726882233.49869: variable 'ansible_facts' from source: unknown 11728 1726882233.50422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882233.52411: variable 'ansible_facts' from source: unknown 11728 1726882233.52864: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882233.53523: attempt loop complete, returning result 11728 1726882233.53700: _execute() done 11728 1726882233.53704: dumping result to json 11728 1726882233.53755: done dumping result, returning 11728 1726882233.53766: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-5c28-a762-000000000fe4] 11728 1726882233.53771: sending task result for task 12673a56-9f93-5c28-a762-000000000fe4 11728 1726882233.56183: done sending task result for task 12673a56-9f93-5c28-a762-000000000fe4 11728 1726882233.56187: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11728 1726882233.56347: no more pending results, returning what we have 11728 1726882233.56350: results queue empty 11728 1726882233.56351: checking for any_errors_fatal 11728 1726882233.56355: done checking for any_errors_fatal 11728 1726882233.56356: checking for max_fail_percentage 11728 1726882233.56357: done checking for max_fail_percentage 11728 1726882233.56358: checking to see if all hosts have failed and the running result is not ok 11728 1726882233.56359: done checking to see if all hosts have failed 11728 1726882233.56359: getting the remaining hosts for this loop 11728 1726882233.56361: done getting the remaining hosts for this loop 11728 1726882233.56364: getting the next task for host managed_node3 11728 1726882233.56371: done getting next task for host managed_node3 11728 1726882233.56374: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 11728 1726882233.56379: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882233.56391: getting variables 11728 1726882233.56392: in VariableManager get_vars() 11728 1726882233.56437: Calling all_inventory to load vars for managed_node3 11728 1726882233.56440: Calling groups_inventory to load vars for managed_node3 11728 1726882233.56442: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882233.56450: Calling all_plugins_play to load vars for managed_node3 11728 1726882233.56452: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882233.56455: Calling groups_plugins_play to load vars for managed_node3 11728 1726882233.57684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882233.59292: done with get_vars() 11728 1726882233.59331: done getting variables 11728 1726882233.59392: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:30:33 -0400 (0:00:00.833) 0:00:58.446 ****** 11728 1726882233.59446: entering _queue_task() for managed_node3/debug 11728 1726882233.59826: worker is 1 (out of 1 available) 11728 1726882233.59839: exiting _queue_task() for managed_node3/debug 11728 1726882233.59852: done queuing things up, now waiting for results queue to drain 11728 1726882233.59853: waiting for pending results... 11728 1726882233.60223: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 11728 1726882233.60332: in run() - task 12673a56-9f93-5c28-a762-000000000e0b 11728 1726882233.60356: variable 'ansible_search_path' from source: unknown 11728 1726882233.60364: variable 'ansible_search_path' from source: unknown 11728 1726882233.60407: calling self._execute() 11728 1726882233.60505: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882233.60518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882233.60540: variable 'omit' from source: magic vars 11728 1726882233.60921: variable 'ansible_distribution_major_version' from source: facts 11728 1726882233.60973: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882233.60976: variable 'omit' from source: magic vars 11728 1726882233.61022: variable 'omit' from source: magic vars 11728 1726882233.61128: variable 'network_provider' from source: set_fact 11728 1726882233.61151: variable 'omit' from source: magic vars 11728 1726882233.61205: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882233.61247: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882233.61301: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882233.61304: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882233.61311: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 
1726882233.61344: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882233.61353: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882233.61361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882233.61518: Set connection var ansible_connection to ssh 11728 1726882233.61522: Set connection var ansible_shell_executable to /bin/sh 11728 1726882233.61525: Set connection var ansible_timeout to 10 11728 1726882233.61527: Set connection var ansible_shell_type to sh 11728 1726882233.61529: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882233.61531: Set connection var ansible_pipelining to False 11728 1726882233.61537: variable 'ansible_shell_executable' from source: unknown 11728 1726882233.61546: variable 'ansible_connection' from source: unknown 11728 1726882233.61552: variable 'ansible_module_compression' from source: unknown 11728 1726882233.61558: variable 'ansible_shell_type' from source: unknown 11728 1726882233.61563: variable 'ansible_shell_executable' from source: unknown 11728 1726882233.61569: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882233.61577: variable 'ansible_pipelining' from source: unknown 11728 1726882233.61585: variable 'ansible_timeout' from source: unknown 11728 1726882233.61596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882233.61900: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882233.61904: variable 'omit' from source: magic vars 11728 1726882233.61906: starting attempt loop 11728 1726882233.61909: running the handler 11728 1726882233.61911: handler run complete 11728 1726882233.61913: attempt loop complete, returning result 11728 1726882233.61915: _execute() done 11728 1726882233.61918: dumping result to json 11728 1726882233.61920: done dumping result, returning 11728 1726882233.61922: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-5c28-a762-000000000e0b] 11728 1726882233.61925: sending task result for task 12673a56-9f93-5c28-a762-000000000e0b 11728 1726882233.61987: done sending task result for task 12673a56-9f93-5c28-a762-000000000e0b 11728 1726882233.61990: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 11728 1726882233.62058: no more pending results, returning what we have 11728 1726882233.62062: results queue empty 11728 1726882233.62063: checking for any_errors_fatal 11728 1726882233.62073: done checking for any_errors_fatal 11728 1726882233.62074: checking for max_fail_percentage 11728 1726882233.62076: done checking for max_fail_percentage 11728 1726882233.62077: checking to see if all hosts have failed and the running result is not ok 11728 1726882233.62078: done checking to see if all hosts have failed 11728 1726882233.62079: getting the remaining hosts for this loop 11728 1726882233.62080: done getting the remaining hosts for this loop 11728 1726882233.62084: getting the next task for host managed_node3 11728 1726882233.62092: done getting next task for host managed_node3 11728 1726882233.62098: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 11728 1726882233.62104: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882233.62117: getting variables 11728 1726882233.62119: in VariableManager get_vars() 11728 1726882233.62164: Calling all_inventory to load vars for managed_node3 11728 1726882233.62167: Calling groups_inventory to load vars for managed_node3 11728 1726882233.62170: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882233.62180: Calling all_plugins_play to load vars for managed_node3 11728 1726882233.62183: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882233.62185: Calling groups_plugins_play to load vars for managed_node3 11728 1726882233.63938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882233.65632: done with get_vars() 11728 1726882233.65657: done getting variables 11728 1726882233.65724: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:30:33 -0400 (0:00:00.063) 0:00:58.510 ****** 11728 1726882233.65767: entering _queue_task() for managed_node3/fail 11728 1726882233.66308: worker is 1 (out of 1 available) 11728 1726882233.66319: exiting _queue_task() for managed_node3/fail 11728 1726882233.66331: done queuing things up, now waiting for results queue to drain 11728 1726882233.66332: waiting for pending results... 
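
The "Print network provider" task that just completed (tasks/main.yml:7) reported "Using network provider: nm", and the log shows the network_provider variable coming from an earlier set_fact. A minimal sketch of a debug task consistent with that output, assuming this exact message template (the role's real wording may differ):

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"  # network_provider comes from an earlier set_fact, per the log
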
11728 1726882233.66513: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11728 1726882233.66668: in run() - task 12673a56-9f93-5c28-a762-000000000e0c 11728 1726882233.66673: variable 'ansible_search_path' from source: unknown 11728 1726882233.66676: variable 'ansible_search_path' from source: unknown 11728 1726882233.66698: calling self._execute() 11728 1726882233.66807: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882233.66819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882233.66886: variable 'omit' from source: magic vars 11728 1726882233.67498: variable 'ansible_distribution_major_version' from source: facts 11728 1726882233.67536: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882233.67784: variable 'network_state' from source: role '' defaults 11728 1726882233.67899: Evaluated conditional (network_state != {}): False 11728 1726882233.67905: when evaluation is False, skipping this task 11728 1726882233.67908: _execute() done 11728 1726882233.67911: dumping result to json 11728 1726882233.67914: done dumping result, returning 11728 1726882233.67916: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-5c28-a762-000000000e0c] 11728 1726882233.67919: sending task result for task 12673a56-9f93-5c28-a762-000000000e0c 11728 1726882233.68328: done sending task result for task 12673a56-9f93-5c28-a762-000000000e0c 11728 1726882233.68331: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11728 1726882233.68379: no more pending results, returning what we have 11728 1726882233.68384: results queue empty 11728 1726882233.68386: checking for any_errors_fatal 11728 1726882233.68398: done checking for any_errors_fatal 11728 1726882233.68399: checking for max_fail_percentage 11728 1726882233.68402: done checking for max_fail_percentage 11728 1726882233.68403: checking to see if all hosts have failed and the running result is not ok 11728 1726882233.68404: done checking to see if all hosts have failed 11728 1726882233.68405: getting the remaining hosts for this loop 11728 1726882233.68407: done getting the remaining hosts for this loop 11728 1726882233.68410: getting the next task for host managed_node3 11728 1726882233.68418: done getting next task for host managed_node3 11728 1726882233.68424: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11728 1726882233.68430: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882233.68457: getting variables 11728 1726882233.68458: in VariableManager get_vars() 11728 1726882233.68651: Calling all_inventory to load vars for managed_node3 11728 1726882233.68654: Calling groups_inventory to load vars for managed_node3 11728 1726882233.68656: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882233.68665: Calling all_plugins_play to load vars for managed_node3 11728 1726882233.68669: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882233.68672: Calling groups_plugins_play to load vars for managed_node3 11728 1726882233.70252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882233.71780: done with get_vars() 11728 1726882233.71806: done getting variables 11728 1726882233.71863: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:30:33 -0400 (0:00:00.061) 0:00:58.571 ****** 11728 1726882233.71906: entering _queue_task() for managed_node3/fail 11728 1726882233.72573: worker is 1 (out of 1 available) 11728 1726882233.72584: exiting _queue_task() for managed_node3/fail 11728 1726882233.72598: done queuing things up, now waiting for results queue to drain 11728 1726882233.72600: waiting for pending results... 
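
The abort task above (tasks/main.yml:11) was skipped because its guard network_state != {} evaluated to False; network_state comes from the role defaults, i.e. an empty dict in this run. A minimal sketch of that guard pattern, assuming a fail module with an illustrative message; only the network_state condition appears in the log, the initscripts check is inferred from the task name:

    - name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
      ansible.builtin.fail:
        msg: Applying network_state with the initscripts provider is not supported  # illustrative message, assumed
      when:
        - network_state != {}                    # the condition the log reports as false_condition
        - network_provider == "initscripts"      # assumed from the task name; not shown in the log
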
11728 1726882233.72898: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11728 1726882233.73256: in run() - task 12673a56-9f93-5c28-a762-000000000e0d 11728 1726882233.73261: variable 'ansible_search_path' from source: unknown 11728 1726882233.73265: variable 'ansible_search_path' from source: unknown 11728 1726882233.73282: calling self._execute() 11728 1726882233.73376: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882233.73381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882233.73390: variable 'omit' from source: magic vars 11728 1726882233.74225: variable 'ansible_distribution_major_version' from source: facts 11728 1726882233.74229: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882233.74513: variable 'network_state' from source: role '' defaults 11728 1726882233.74523: Evaluated conditional (network_state != {}): False 11728 1726882233.74527: when evaluation is False, skipping this task 11728 1726882233.74529: _execute() done 11728 1726882233.74532: dumping result to json 11728 1726882233.74534: done dumping result, returning 11728 1726882233.74543: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-5c28-a762-000000000e0d] 11728 1726882233.74551: sending task result for task 12673a56-9f93-5c28-a762-000000000e0d 11728 1726882233.74732: done sending task result for task 12673a56-9f93-5c28-a762-000000000e0d 11728 1726882233.74736: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11728 1726882233.74812: no more pending results, returning what we have 11728 1726882233.74816: results queue empty 11728 1726882233.74817: checking for any_errors_fatal 11728 1726882233.74826: done checking for any_errors_fatal 11728 1726882233.74827: checking for max_fail_percentage 11728 1726882233.74829: done checking for max_fail_percentage 11728 1726882233.74830: checking to see if all hosts have failed and the running result is not ok 11728 1726882233.74830: done checking to see if all hosts have failed 11728 1726882233.74831: getting the remaining hosts for this loop 11728 1726882233.74833: done getting the remaining hosts for this loop 11728 1726882233.74836: getting the next task for host managed_node3 11728 1726882233.74843: done getting next task for host managed_node3 11728 1726882233.74847: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11728 1726882233.74852: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882233.74876: getting variables 11728 1726882233.74877: in VariableManager get_vars() 11728 1726882233.74925: Calling all_inventory to load vars for managed_node3 11728 1726882233.74928: Calling groups_inventory to load vars for managed_node3 11728 1726882233.74930: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882233.74940: Calling all_plugins_play to load vars for managed_node3 11728 1726882233.74942: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882233.74945: Calling groups_plugins_play to load vars for managed_node3 11728 1726882233.77924: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882233.80866: done with get_vars() 11728 1726882233.81101: done getting variables 11728 1726882233.81163: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:30:33 -0400 (0:00:00.092) 0:00:58.664 ****** 11728 1726882233.81204: entering _queue_task() for managed_node3/fail 11728 1726882233.81963: worker is 1 (out of 1 available) 11728 1726882233.81975: exiting _queue_task() for managed_node3/fail 11728 1726882233.81988: done queuing things up, now waiting for results queue to drain 11728 1726882233.81990: waiting for pending results... 
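
Each role task in this stretch of the log is first checked against ansible_distribution_major_version != '6' and only then against its own condition. That pattern is what a conditional attached to a static import (or an enclosing block) produces: Ansible re-applies it to every task the import brings in. The sketch below shows the import_role variant; whether this particular run gates the role that way or through some other wrapper is not visible in the log, so treat it as an illustration only.

    # A conditional on a static import is applied to every task the import pulls in,
    # which is why the same check is re-evaluated in front of each role task above.
    - name: Configure networking on anything newer than EL6 (illustrative wrapper, assumed)
      ansible.builtin.import_role:
        name: fedora.linux_system_roles.network
      when: ansible_distribution_major_version != '6'
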
11728 1726882233.82613: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11728 1726882233.82928: in run() - task 12673a56-9f93-5c28-a762-000000000e0e 11728 1726882233.82942: variable 'ansible_search_path' from source: unknown 11728 1726882233.82945: variable 'ansible_search_path' from source: unknown 11728 1726882233.83039: calling self._execute() 11728 1726882233.83255: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882233.83261: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882233.83270: variable 'omit' from source: magic vars 11728 1726882233.84115: variable 'ansible_distribution_major_version' from source: facts 11728 1726882233.84124: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882233.84451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882233.89400: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882233.89405: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882233.89408: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882233.89410: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882233.89533: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882233.89613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882233.89646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882233.89667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882233.89936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882233.89951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882233.90169: variable 'ansible_distribution_major_version' from source: facts 11728 1726882233.90185: Evaluated conditional (ansible_distribution_major_version | int > 9): True 11728 1726882233.90505: variable 'ansible_distribution' from source: facts 11728 1726882233.90509: variable '__network_rh_distros' from source: role '' defaults 11728 1726882233.90522: Evaluated conditional (ansible_distribution in __network_rh_distros): True 11728 1726882233.91282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882233.91391: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882233.91396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882233.91399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882233.91401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882233.91700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882233.91703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882233.91706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882233.91742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882233.91759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882233.91918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882233.91948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882233.91976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882233.92022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882233.92137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882233.92670: variable 'network_connections' from source: task vars 11728 1726882233.92999: variable 'port2_profile' from source: play vars 11728 1726882233.93004: variable 'port2_profile' from source: play vars 11728 1726882233.93006: variable 'port1_profile' from source: play vars 11728 1726882233.93030: variable 'port1_profile' from source: play vars 11728 1726882233.93039: variable 'controller_profile' from source: play vars 
11728 1726882233.93121: variable 'controller_profile' from source: play vars 11728 1726882233.93124: variable 'network_state' from source: role '' defaults 11728 1726882233.93180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882233.93372: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882233.93415: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882233.93553: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882233.93556: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882233.93568: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882233.93571: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882233.93573: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882233.93604: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882233.93632: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 11728 1726882233.93636: when evaluation is False, skipping this task 11728 1726882233.93638: _execute() done 11728 1726882233.93641: dumping result to json 11728 1726882233.93643: done dumping result, returning 11728 1726882233.93651: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-5c28-a762-000000000e0e] 11728 1726882233.93661: sending task result for task 12673a56-9f93-5c28-a762-000000000e0e 11728 1726882233.93753: done sending task result for task 12673a56-9f93-5c28-a762-000000000e0e 11728 1726882233.93756: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 11728 1726882233.93811: no more pending results, returning what we have 11728 1726882233.93815: results queue empty 11728 1726882233.93816: checking for any_errors_fatal 11728 1726882233.93823: done checking for any_errors_fatal 11728 1726882233.93823: checking for max_fail_percentage 11728 1726882233.93825: done checking for max_fail_percentage 11728 1726882233.93826: checking to see if all hosts have failed and the running result is not ok 11728 1726882233.93827: done checking to see if all hosts have failed 11728 
1726882233.93827: getting the remaining hosts for this loop 11728 1726882233.93829: done getting the remaining hosts for this loop 11728 1726882233.93832: getting the next task for host managed_node3 11728 1726882233.93840: done getting next task for host managed_node3 11728 1726882233.93844: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11728 1726882233.93849: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882233.93871: getting variables 11728 1726882233.93872: in VariableManager get_vars() 11728 1726882233.93922: Calling all_inventory to load vars for managed_node3 11728 1726882233.93924: Calling groups_inventory to load vars for managed_node3 11728 1726882233.93927: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882233.93936: Calling all_plugins_play to load vars for managed_node3 11728 1726882233.93939: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882233.93941: Calling groups_plugins_play to load vars for managed_node3 11728 1726882233.95687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882233.97528: done with get_vars() 11728 1726882233.97556: done getting variables 11728 1726882233.97624: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:30:33 -0400 (0:00:00.164) 0:00:58.828 ****** 11728 1726882233.97658: entering _queue_task() for managed_node3/dnf 11728 1726882233.98124: worker is 1 (out of 1 available) 11728 1726882233.98135: exiting _queue_task() for managed_node3/dnf 11728 1726882233.98146: done queuing things up, now waiting for results queue to drain 11728 1726882233.98147: waiting for pending results... 
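
The teaming abort above (tasks/main.yml:25) evaluated three stacked conditions: the distribution major version is greater than 9, the distribution is in __network_rh_distros, and finally whether any profile in network_connections or network_state declares type team. Only the last one was False, so the task was skipped. A minimal sketch of that guard, with the when expressions copied from the log and an assumed fail message:

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Team interfaces are not supported on EL10 or later  # illustrative message, assumed
      when:
        - ansible_distribution_major_version | int > 9
        - ansible_distribution in __network_rh_distros
        # the condition the log reports as false_condition for this task:
        - >-
          network_connections | selectattr("type", "defined") |
          selectattr("type", "match", "^team$") | list | length > 0
          or network_state.get("interfaces", []) | selectattr("type", "defined") |
          selectattr("type", "match", "^team$") | list | length > 0
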
11728 1726882233.98510: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11728 1726882233.98530: in run() - task 12673a56-9f93-5c28-a762-000000000e0f 11728 1726882233.98550: variable 'ansible_search_path' from source: unknown 11728 1726882233.98553: variable 'ansible_search_path' from source: unknown 11728 1726882233.98662: calling self._execute() 11728 1726882233.98819: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882233.98824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882233.98833: variable 'omit' from source: magic vars 11728 1726882233.99702: variable 'ansible_distribution_major_version' from source: facts 11728 1726882233.99716: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882234.00024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882234.03920: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882234.04003: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882234.04040: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882234.04247: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882234.04269: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882234.04699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882234.04703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882234.04705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882234.04708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882234.04710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882234.05099: variable 'ansible_distribution' from source: facts 11728 1726882234.05102: variable 'ansible_distribution_major_version' from source: facts 11728 1726882234.05104: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 11728 1726882234.05228: variable '__network_wireless_connections_defined' from source: role '' defaults 11728 1726882234.05476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882234.05503: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882234.05621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882234.05664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882234.05678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882234.05834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882234.05860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882234.05883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882234.05946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882234.05984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882234.06058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882234.06082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882234.06110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882234.06283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882234.06359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882234.06708: variable 'network_connections' from source: task vars 11728 1726882234.06719: variable 'port2_profile' from source: play vars 11728 1726882234.06782: variable 'port2_profile' from source: play vars 11728 1726882234.06795: variable 'port1_profile' from source: play vars 11728 1726882234.06859: variable 'port1_profile' from source: play vars 11728 1726882234.06867: variable 'controller_profile' from source: play vars 
11728 1726882234.07053: variable 'controller_profile' from source: play vars 11728 1726882234.07269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882234.07590: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882234.07632: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882234.07787: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882234.07819: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882234.07861: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882234.07881: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882234.08032: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882234.08036: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882234.08054: variable '__network_team_connections_defined' from source: role '' defaults 11728 1726882234.08310: variable 'network_connections' from source: task vars 11728 1726882234.08314: variable 'port2_profile' from source: play vars 11728 1726882234.08378: variable 'port2_profile' from source: play vars 11728 1726882234.08385: variable 'port1_profile' from source: play vars 11728 1726882234.08452: variable 'port1_profile' from source: play vars 11728 1726882234.08459: variable 'controller_profile' from source: play vars 11728 1726882234.08519: variable 'controller_profile' from source: play vars 11728 1726882234.08549: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11728 1726882234.08552: when evaluation is False, skipping this task 11728 1726882234.08554: _execute() done 11728 1726882234.08557: dumping result to json 11728 1726882234.08559: done dumping result, returning 11728 1726882234.08598: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-5c28-a762-000000000e0f] 11728 1726882234.08602: sending task result for task 12673a56-9f93-5c28-a762-000000000e0f 11728 1726882234.08670: done sending task result for task 12673a56-9f93-5c28-a762-000000000e0f 11728 1726882234.08672: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11728 1726882234.08731: no more pending results, returning what we have 11728 1726882234.08735: results queue empty 11728 1726882234.08736: checking for any_errors_fatal 11728 1726882234.08740: done checking for any_errors_fatal 11728 1726882234.08741: checking for max_fail_percentage 11728 1726882234.08743: done checking 
for max_fail_percentage 11728 1726882234.08744: checking to see if all hosts have failed and the running result is not ok 11728 1726882234.08744: done checking to see if all hosts have failed 11728 1726882234.08745: getting the remaining hosts for this loop 11728 1726882234.08747: done getting the remaining hosts for this loop 11728 1726882234.08750: getting the next task for host managed_node3 11728 1726882234.08757: done getting next task for host managed_node3 11728 1726882234.08761: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11728 1726882234.08765: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882234.08788: getting variables 11728 1726882234.08790: in VariableManager get_vars() 11728 1726882234.08838: Calling all_inventory to load vars for managed_node3 11728 1726882234.08841: Calling groups_inventory to load vars for managed_node3 11728 1726882234.08843: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882234.08853: Calling all_plugins_play to load vars for managed_node3 11728 1726882234.08855: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882234.08858: Calling groups_plugins_play to load vars for managed_node3 11728 1726882234.10629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882234.12253: done with get_vars() 11728 1726882234.12276: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11728 1726882234.12489: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:30:34 -0400 (0:00:00.148) 0:00:58.977 ****** 11728 1726882234.12529: entering _queue_task() for managed_node3/yum 11728 1726882234.12876: worker is 1 (out of 1 available) 11728 1726882234.12889: exiting _queue_task() for managed_node3/yum 11728 1726882234.12905: done queuing things up, now waiting for results queue to drain 11728 1726882234.12908: waiting for pending results... 
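
The DNF variant of the update check above (tasks/main.yml:36) was skipped because neither a wireless nor a team profile is defined in network_connections, so __network_wireless_connections_defined or __network_team_connections_defined came back False. A sketch of what such a check can look like: the dnf module matches the action plugin the log loaded for this task and the conditions are the ones it evaluated, while the package variable and check_mode are assumptions.

    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"  # assumed variable holding the role's package list
        state: latest
      check_mode: true                  # assumed: only report whether updates exist, change nothing
      when:
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
        - __network_wireless_connections_defined or __network_team_connections_defined
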
11728 1726882234.13310: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11728 1726882234.13332: in run() - task 12673a56-9f93-5c28-a762-000000000e10 11728 1726882234.13352: variable 'ansible_search_path' from source: unknown 11728 1726882234.13360: variable 'ansible_search_path' from source: unknown 11728 1726882234.13400: calling self._execute() 11728 1726882234.13496: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882234.13509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882234.13523: variable 'omit' from source: magic vars 11728 1726882234.13881: variable 'ansible_distribution_major_version' from source: facts 11728 1726882234.13900: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882234.14298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882234.16212: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882234.16284: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882234.16331: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882234.16373: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882234.16407: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882234.16495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882234.16531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882234.16564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882234.16611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882234.16630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882234.16729: variable 'ansible_distribution_major_version' from source: facts 11728 1726882234.16751: Evaluated conditional (ansible_distribution_major_version | int < 8): False 11728 1726882234.16760: when evaluation is False, skipping this task 11728 1726882234.16768: _execute() done 11728 1726882234.16776: dumping result to json 11728 1726882234.16783: done dumping result, returning 11728 1726882234.16798: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-5c28-a762-000000000e10] 11728 
1726882234.16809: sending task result for task 12673a56-9f93-5c28-a762-000000000e10 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 11728 1726882234.16971: no more pending results, returning what we have 11728 1726882234.16974: results queue empty 11728 1726882234.16975: checking for any_errors_fatal 11728 1726882234.16980: done checking for any_errors_fatal 11728 1726882234.16980: checking for max_fail_percentage 11728 1726882234.16982: done checking for max_fail_percentage 11728 1726882234.16983: checking to see if all hosts have failed and the running result is not ok 11728 1726882234.16984: done checking to see if all hosts have failed 11728 1726882234.16984: getting the remaining hosts for this loop 11728 1726882234.16986: done getting the remaining hosts for this loop 11728 1726882234.16990: getting the next task for host managed_node3 11728 1726882234.17001: done getting next task for host managed_node3 11728 1726882234.17005: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11728 1726882234.17010: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882234.17038: getting variables 11728 1726882234.17040: in VariableManager get_vars() 11728 1726882234.17086: Calling all_inventory to load vars for managed_node3 11728 1726882234.17089: Calling groups_inventory to load vars for managed_node3 11728 1726882234.17091: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882234.17300: done sending task result for task 12673a56-9f93-5c28-a762-000000000e10 11728 1726882234.17304: WORKER PROCESS EXITING 11728 1726882234.17313: Calling all_plugins_play to load vars for managed_node3 11728 1726882234.17317: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882234.17321: Calling groups_plugins_play to load vars for managed_node3 11728 1726882234.18596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882234.20254: done with get_vars() 11728 1726882234.20275: done getting variables 11728 1726882234.20338: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:30:34 -0400 (0:00:00.078) 0:00:59.056 ****** 11728 1726882234.20375: entering _queue_task() for managed_node3/fail 11728 1726882234.20709: worker is 1 (out of 1 available) 11728 1726882234.20723: exiting _queue_task() for managed_node3/fail 11728 1726882234.20736: done queuing things up, now waiting for results queue to drain 11728 1726882234.20737: waiting for pending results... 
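This consent task uses the fail action plugin and, as the evaluation below shows, is skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined is true for the configured profiles. An illustrative sketch of such a guard (the msg text is an assumption, and the real role presumably also honors an operator consent variable before failing):

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: >-
      Wireless or team connections require NetworkManager to be restarted;
      aborting until the operator confirms the restart.
  when: __network_wireless_connections_defined or __network_team_connections_defined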
11728 1726882234.21029: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11728 1726882234.21186: in run() - task 12673a56-9f93-5c28-a762-000000000e11 11728 1726882234.21396: variable 'ansible_search_path' from source: unknown 11728 1726882234.21401: variable 'ansible_search_path' from source: unknown 11728 1726882234.21404: calling self._execute() 11728 1726882234.21407: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882234.21409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882234.21412: variable 'omit' from source: magic vars 11728 1726882234.21746: variable 'ansible_distribution_major_version' from source: facts 11728 1726882234.21764: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882234.21888: variable '__network_wireless_connections_defined' from source: role '' defaults 11728 1726882234.22089: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882234.24328: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882234.24408: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882234.24451: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882234.24501: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882234.24534: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882234.24626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882234.24660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882234.24696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882234.24742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882234.24760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882234.24816: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882234.24845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882234.24874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882234.24924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882234.24943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882234.24987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882234.25023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882234.25053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882234.25097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882234.25200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882234.25308: variable 'network_connections' from source: task vars 11728 1726882234.25319: variable 'port2_profile' from source: play vars 11728 1726882234.25376: variable 'port2_profile' from source: play vars 11728 1726882234.25379: variable 'port1_profile' from source: play vars 11728 1726882234.25428: variable 'port1_profile' from source: play vars 11728 1726882234.25436: variable 'controller_profile' from source: play vars 11728 1726882234.25483: variable 'controller_profile' from source: play vars 11728 1726882234.25536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882234.25667: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882234.25699: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882234.25723: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882234.25744: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882234.25776: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882234.25792: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882234.25818: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882234.25833: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882234.25876: variable '__network_team_connections_defined' from source: role '' defaults 11728 1726882234.26028: variable 'network_connections' from source: task vars 11728 1726882234.26031: variable 'port2_profile' from source: play vars 11728 1726882234.26074: variable 'port2_profile' from source: play vars 11728 1726882234.26083: variable 'port1_profile' from source: play vars 11728 1726882234.26127: variable 'port1_profile' from source: play vars 11728 1726882234.26133: variable 'controller_profile' from source: play vars 11728 1726882234.26176: variable 'controller_profile' from source: play vars 11728 1726882234.26198: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11728 1726882234.26210: when evaluation is False, skipping this task 11728 1726882234.26213: _execute() done 11728 1726882234.26215: dumping result to json 11728 1726882234.26218: done dumping result, returning 11728 1726882234.26220: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-5c28-a762-000000000e11] 11728 1726882234.26222: sending task result for task 12673a56-9f93-5c28-a762-000000000e11 11728 1726882234.26315: done sending task result for task 12673a56-9f93-5c28-a762-000000000e11 11728 1726882234.26318: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11728 1726882234.26378: no more pending results, returning what we have 11728 1726882234.26382: results queue empty 11728 1726882234.26383: checking for any_errors_fatal 11728 1726882234.26389: done checking for any_errors_fatal 11728 1726882234.26390: checking for max_fail_percentage 11728 1726882234.26392: done checking for max_fail_percentage 11728 1726882234.26396: checking to see if all hosts have failed and the running result is not ok 11728 1726882234.26397: done checking to see if all hosts have failed 11728 1726882234.26398: getting the remaining hosts for this loop 11728 1726882234.26400: done getting the remaining hosts for this loop 11728 1726882234.26403: getting the next task for host managed_node3 11728 1726882234.26411: done getting next task for host managed_node3 11728 1726882234.26415: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 11728 1726882234.26421: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882234.26446: getting variables 11728 1726882234.26448: in VariableManager get_vars() 11728 1726882234.26490: Calling all_inventory to load vars for managed_node3 11728 1726882234.26501: Calling groups_inventory to load vars for managed_node3 11728 1726882234.26505: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882234.26514: Calling all_plugins_play to load vars for managed_node3 11728 1726882234.26516: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882234.26519: Calling groups_plugins_play to load vars for managed_node3 11728 1726882234.27539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882234.28696: done with get_vars() 11728 1726882234.28721: done getting variables 11728 1726882234.28765: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:30:34 -0400 (0:00:00.084) 0:00:59.140 ****** 11728 1726882234.28797: entering _queue_task() for managed_node3/package 11728 1726882234.29065: worker is 1 (out of 1 available) 11728 1726882234.29081: exiting _queue_task() for managed_node3/package 11728 1726882234.29097: done queuing things up, now waiting for results queue to drain 11728 1726882234.29099: waiting for pending results... 
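The Install packages task queued above is the role's main package installation step. Its guard uses Ansible's subset test against the package facts gathered earlier in the run, so the package module is only invoked when at least one entry of network_packages is missing from ansible_facts.packages; on this host everything is already installed, which is why the task is skipped a few lines further down. A minimal sketch of that pattern (the task name and conditional are taken from the log, the module arguments are assumptions):

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())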
11728 1726882234.29291: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 11728 1726882234.29408: in run() - task 12673a56-9f93-5c28-a762-000000000e12 11728 1726882234.29421: variable 'ansible_search_path' from source: unknown 11728 1726882234.29425: variable 'ansible_search_path' from source: unknown 11728 1726882234.29457: calling self._execute() 11728 1726882234.29532: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882234.29537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882234.29545: variable 'omit' from source: magic vars 11728 1726882234.29834: variable 'ansible_distribution_major_version' from source: facts 11728 1726882234.29843: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882234.29983: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882234.30300: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882234.30304: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882234.30322: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882234.30355: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882234.30467: variable 'network_packages' from source: role '' defaults 11728 1726882234.30571: variable '__network_provider_setup' from source: role '' defaults 11728 1726882234.30581: variable '__network_service_name_default_nm' from source: role '' defaults 11728 1726882234.30645: variable '__network_service_name_default_nm' from source: role '' defaults 11728 1726882234.30656: variable '__network_packages_default_nm' from source: role '' defaults 11728 1726882234.30859: variable '__network_packages_default_nm' from source: role '' defaults 11728 1726882234.31098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882234.32546: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882234.32857: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882234.32884: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882234.32912: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882234.32932: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882234.32994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882234.33017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882234.33034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882234.33063: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882234.33074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882234.33110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882234.33126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882234.33142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882234.33167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882234.33178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882234.33366: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11728 1726882234.33484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882234.33510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882234.33799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882234.33802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882234.33805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882234.33807: variable 'ansible_python' from source: facts 11728 1726882234.33809: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11728 1726882234.33811: variable '__network_wpa_supplicant_required' from source: role '' defaults 11728 1726882234.33843: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11728 1726882234.33965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882234.33992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 11728 1726882234.34022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882234.34060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882234.34073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882234.34121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882234.34152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882234.34164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882234.34198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882234.34215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882234.34332: variable 'network_connections' from source: task vars 11728 1726882234.34335: variable 'port2_profile' from source: play vars 11728 1726882234.34442: variable 'port2_profile' from source: play vars 11728 1726882234.34445: variable 'port1_profile' from source: play vars 11728 1726882234.34590: variable 'port1_profile' from source: play vars 11728 1726882234.34595: variable 'controller_profile' from source: play vars 11728 1726882234.34626: variable 'controller_profile' from source: play vars 11728 1726882234.34696: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882234.34720: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882234.34750: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882234.34779: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882234.34913: variable '__network_wireless_connections_defined' from source: role '' defaults 11728 1726882234.35097: variable 'network_connections' from source: task vars 11728 1726882234.35105: variable 'port2_profile' from source: play vars 11728 1726882234.35199: variable 'port2_profile' from source: play vars 11728 
1726882234.35210: variable 'port1_profile' from source: play vars 11728 1726882234.35353: variable 'port1_profile' from source: play vars 11728 1726882234.35360: variable 'controller_profile' from source: play vars 11728 1726882234.35412: variable 'controller_profile' from source: play vars 11728 1726882234.35442: variable '__network_packages_default_wireless' from source: role '' defaults 11728 1726882234.35538: variable '__network_wireless_connections_defined' from source: role '' defaults 11728 1726882234.35772: variable 'network_connections' from source: task vars 11728 1726882234.35775: variable 'port2_profile' from source: play vars 11728 1726882234.35831: variable 'port2_profile' from source: play vars 11728 1726882234.35837: variable 'port1_profile' from source: play vars 11728 1726882234.35880: variable 'port1_profile' from source: play vars 11728 1726882234.35890: variable 'controller_profile' from source: play vars 11728 1726882234.35937: variable 'controller_profile' from source: play vars 11728 1726882234.35954: variable '__network_packages_default_team' from source: role '' defaults 11728 1726882234.36011: variable '__network_team_connections_defined' from source: role '' defaults 11728 1726882234.36206: variable 'network_connections' from source: task vars 11728 1726882234.36209: variable 'port2_profile' from source: play vars 11728 1726882234.36256: variable 'port2_profile' from source: play vars 11728 1726882234.36262: variable 'port1_profile' from source: play vars 11728 1726882234.36311: variable 'port1_profile' from source: play vars 11728 1726882234.36317: variable 'controller_profile' from source: play vars 11728 1726882234.36364: variable 'controller_profile' from source: play vars 11728 1726882234.36405: variable '__network_service_name_default_initscripts' from source: role '' defaults 11728 1726882234.36449: variable '__network_service_name_default_initscripts' from source: role '' defaults 11728 1726882234.36455: variable '__network_packages_default_initscripts' from source: role '' defaults 11728 1726882234.36497: variable '__network_packages_default_initscripts' from source: role '' defaults 11728 1726882234.36633: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11728 1726882234.41900: variable 'network_connections' from source: task vars 11728 1726882234.41904: variable 'port2_profile' from source: play vars 11728 1726882234.41906: variable 'port2_profile' from source: play vars 11728 1726882234.41908: variable 'port1_profile' from source: play vars 11728 1726882234.41910: variable 'port1_profile' from source: play vars 11728 1726882234.41912: variable 'controller_profile' from source: play vars 11728 1726882234.41958: variable 'controller_profile' from source: play vars 11728 1726882234.41971: variable 'ansible_distribution' from source: facts 11728 1726882234.41978: variable '__network_rh_distros' from source: role '' defaults 11728 1726882234.41986: variable 'ansible_distribution_major_version' from source: facts 11728 1726882234.42010: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11728 1726882234.42182: variable 'ansible_distribution' from source: facts 11728 1726882234.42191: variable '__network_rh_distros' from source: role '' defaults 11728 1726882234.42204: variable 'ansible_distribution_major_version' from source: facts 11728 1726882234.42222: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11728 1726882234.42383: 
variable 'ansible_distribution' from source: facts 11728 1726882234.42395: variable '__network_rh_distros' from source: role '' defaults 11728 1726882234.42406: variable 'ansible_distribution_major_version' from source: facts 11728 1726882234.42445: variable 'network_provider' from source: set_fact 11728 1726882234.42466: variable 'ansible_facts' from source: unknown 11728 1726882234.43062: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 11728 1726882234.43065: when evaluation is False, skipping this task 11728 1726882234.43068: _execute() done 11728 1726882234.43070: dumping result to json 11728 1726882234.43072: done dumping result, returning 11728 1726882234.43078: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-5c28-a762-000000000e12] 11728 1726882234.43080: sending task result for task 12673a56-9f93-5c28-a762-000000000e12 11728 1726882234.43177: done sending task result for task 12673a56-9f93-5c28-a762-000000000e12 11728 1726882234.43180: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 11728 1726882234.43256: no more pending results, returning what we have 11728 1726882234.43260: results queue empty 11728 1726882234.43261: checking for any_errors_fatal 11728 1726882234.43267: done checking for any_errors_fatal 11728 1726882234.43268: checking for max_fail_percentage 11728 1726882234.43270: done checking for max_fail_percentage 11728 1726882234.43271: checking to see if all hosts have failed and the running result is not ok 11728 1726882234.43272: done checking to see if all hosts have failed 11728 1726882234.43272: getting the remaining hosts for this loop 11728 1726882234.43274: done getting the remaining hosts for this loop 11728 1726882234.43283: getting the next task for host managed_node3 11728 1726882234.43298: done getting next task for host managed_node3 11728 1726882234.43302: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11728 1726882234.43307: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882234.43331: getting variables 11728 1726882234.43332: in VariableManager get_vars() 11728 1726882234.43373: Calling all_inventory to load vars for managed_node3 11728 1726882234.43376: Calling groups_inventory to load vars for managed_node3 11728 1726882234.43378: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882234.43387: Calling all_plugins_play to load vars for managed_node3 11728 1726882234.43389: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882234.43392: Calling groups_plugins_play to load vars for managed_node3 11728 1726882234.50228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882234.51846: done with get_vars() 11728 1726882234.51878: done getting variables 11728 1726882234.51968: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:30:34 -0400 (0:00:00.232) 0:00:59.372 ****** 11728 1726882234.52179: entering _queue_task() for managed_node3/package 11728 1726882234.52657: worker is 1 (out of 1 available) 11728 1726882234.52667: exiting _queue_task() for managed_node3/package 11728 1726882234.52678: done queuing things up, now waiting for results queue to drain 11728 1726882234.52680: waiting for pending results... 
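The task queued here installs the nmstate tooling only when the declarative network_state variable is in use. As the following lines show, network_state is empty in this run, so the guard network_state != {} is False and the task is skipped. A sketch of such a task (the package names are inferred from the task title and are an assumption):

- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}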
11728 1726882234.53075: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11728 1726882234.53392: in run() - task 12673a56-9f93-5c28-a762-000000000e13 11728 1726882234.53402: variable 'ansible_search_path' from source: unknown 11728 1726882234.53405: variable 'ansible_search_path' from source: unknown 11728 1726882234.53410: calling self._execute() 11728 1726882234.53779: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882234.53835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882234.53840: variable 'omit' from source: magic vars 11728 1726882234.54345: variable 'ansible_distribution_major_version' from source: facts 11728 1726882234.54358: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882234.54487: variable 'network_state' from source: role '' defaults 11728 1726882234.54500: Evaluated conditional (network_state != {}): False 11728 1726882234.54503: when evaluation is False, skipping this task 11728 1726882234.54545: _execute() done 11728 1726882234.54549: dumping result to json 11728 1726882234.54553: done dumping result, returning 11728 1726882234.54556: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-5c28-a762-000000000e13] 11728 1726882234.54559: sending task result for task 12673a56-9f93-5c28-a762-000000000e13 11728 1726882234.54742: done sending task result for task 12673a56-9f93-5c28-a762-000000000e13 11728 1726882234.54748: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11728 1726882234.54799: no more pending results, returning what we have 11728 1726882234.54804: results queue empty 11728 1726882234.54806: checking for any_errors_fatal 11728 1726882234.54813: done checking for any_errors_fatal 11728 1726882234.54814: checking for max_fail_percentage 11728 1726882234.54816: done checking for max_fail_percentage 11728 1726882234.54817: checking to see if all hosts have failed and the running result is not ok 11728 1726882234.54817: done checking to see if all hosts have failed 11728 1726882234.54818: getting the remaining hosts for this loop 11728 1726882234.54821: done getting the remaining hosts for this loop 11728 1726882234.54824: getting the next task for host managed_node3 11728 1726882234.54834: done getting next task for host managed_node3 11728 1726882234.54838: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11728 1726882234.54844: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882234.54869: getting variables 11728 1726882234.54871: in VariableManager get_vars() 11728 1726882234.54922: Calling all_inventory to load vars for managed_node3 11728 1726882234.54925: Calling groups_inventory to load vars for managed_node3 11728 1726882234.54928: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882234.54939: Calling all_plugins_play to load vars for managed_node3 11728 1726882234.54942: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882234.54945: Calling groups_plugins_play to load vars for managed_node3 11728 1726882234.56445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882234.58051: done with get_vars() 11728 1726882234.58075: done getting variables 11728 1726882234.58135: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:30:34 -0400 (0:00:00.061) 0:00:59.434 ****** 11728 1726882234.58179: entering _queue_task() for managed_node3/package 11728 1726882234.58524: worker is 1 (out of 1 available) 11728 1726882234.58539: exiting _queue_task() for managed_node3/package 11728 1726882234.58553: done queuing things up, now waiting for results queue to drain 11728 1726882234.58554: waiting for pending results... 
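The python3-libnmstate task queued above is guarded by the same network_state != {} condition and is likewise skipped below. For context, a play that supplies network_state (and would therefore trigger both nmstate-related install tasks) might look like the following; the host pattern, the interface name eth1 and the specific desired state are hypothetical examples, not values from this run:

- hosts: managed_node3
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_state:
          interfaces:
            - name: eth1        # hypothetical interface, for illustration only
              type: ethernet
              state: up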
11728 1726882234.58848: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11728 1726882234.59058: in run() - task 12673a56-9f93-5c28-a762-000000000e14 11728 1726882234.59109: variable 'ansible_search_path' from source: unknown 11728 1726882234.59113: variable 'ansible_search_path' from source: unknown 11728 1726882234.59149: calling self._execute() 11728 1726882234.59307: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882234.59311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882234.59314: variable 'omit' from source: magic vars 11728 1726882234.59788: variable 'ansible_distribution_major_version' from source: facts 11728 1726882234.60200: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882234.60204: variable 'network_state' from source: role '' defaults 11728 1726882234.60207: Evaluated conditional (network_state != {}): False 11728 1726882234.60210: when evaluation is False, skipping this task 11728 1726882234.60212: _execute() done 11728 1726882234.60214: dumping result to json 11728 1726882234.60216: done dumping result, returning 11728 1726882234.60219: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-5c28-a762-000000000e14] 11728 1726882234.60222: sending task result for task 12673a56-9f93-5c28-a762-000000000e14 11728 1726882234.60292: done sending task result for task 12673a56-9f93-5c28-a762-000000000e14 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11728 1726882234.60343: no more pending results, returning what we have 11728 1726882234.60346: results queue empty 11728 1726882234.60348: checking for any_errors_fatal 11728 1726882234.60353: done checking for any_errors_fatal 11728 1726882234.60353: checking for max_fail_percentage 11728 1726882234.60355: done checking for max_fail_percentage 11728 1726882234.60356: checking to see if all hosts have failed and the running result is not ok 11728 1726882234.60356: done checking to see if all hosts have failed 11728 1726882234.60357: getting the remaining hosts for this loop 11728 1726882234.60359: done getting the remaining hosts for this loop 11728 1726882234.60362: getting the next task for host managed_node3 11728 1726882234.60369: done getting next task for host managed_node3 11728 1726882234.60373: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11728 1726882234.60378: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882234.60502: WORKER PROCESS EXITING 11728 1726882234.60515: getting variables 11728 1726882234.60517: in VariableManager get_vars() 11728 1726882234.60554: Calling all_inventory to load vars for managed_node3 11728 1726882234.60557: Calling groups_inventory to load vars for managed_node3 11728 1726882234.60559: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882234.60567: Calling all_plugins_play to load vars for managed_node3 11728 1726882234.60569: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882234.60571: Calling groups_plugins_play to load vars for managed_node3 11728 1726882234.62036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882234.63631: done with get_vars() 11728 1726882234.63655: done getting variables 11728 1726882234.63720: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:30:34 -0400 (0:00:00.055) 0:00:59.490 ****** 11728 1726882234.63768: entering _queue_task() for managed_node3/service 11728 1726882234.64229: worker is 1 (out of 1 available) 11728 1726882234.64242: exiting _queue_task() for managed_node3/service 11728 1726882234.64254: done queuing things up, now waiting for results queue to drain 11728 1726882234.64255: waiting for pending results... 
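The restart task queued here is guarded by the same wireless/team condition as the earlier consent task; the profiles built from port1_profile, port2_profile and controller_profile evidently define neither a wireless nor a team connection, so the guard stays False and no restart is attempted. For illustration, a network_connections entry that would flip __network_team_connections_defined to True could look like this (the profile name team0 and its options are assumptions):

network_connections:
  - name: team0               # hypothetical team profile
    type: team
    interface_name: team0
    state: up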
11728 1726882234.64463: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11728 1726882234.64631: in run() - task 12673a56-9f93-5c28-a762-000000000e15 11728 1726882234.64645: variable 'ansible_search_path' from source: unknown 11728 1726882234.64654: variable 'ansible_search_path' from source: unknown 11728 1726882234.64685: calling self._execute() 11728 1726882234.64781: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882234.64785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882234.64800: variable 'omit' from source: magic vars 11728 1726882234.65501: variable 'ansible_distribution_major_version' from source: facts 11728 1726882234.65505: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882234.65508: variable '__network_wireless_connections_defined' from source: role '' defaults 11728 1726882234.65527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882234.67798: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882234.67870: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882234.67911: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882234.67950: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882234.67974: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882234.68059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882234.68089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882234.68120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882234.68163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882234.68177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882234.68235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882234.68263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882234.68500: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 11728 1726882234.68504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882234.68506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882234.68509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882234.68511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882234.68514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882234.68517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882234.68519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882234.68672: variable 'network_connections' from source: task vars 11728 1726882234.68689: variable 'port2_profile' from source: play vars 11728 1726882234.68756: variable 'port2_profile' from source: play vars 11728 1726882234.68767: variable 'port1_profile' from source: play vars 11728 1726882234.68835: variable 'port1_profile' from source: play vars 11728 1726882234.68843: variable 'controller_profile' from source: play vars 11728 1726882234.68908: variable 'controller_profile' from source: play vars 11728 1726882234.68965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882234.69128: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882234.69165: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882234.69200: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882234.69235: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882234.69279: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882234.69302: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882234.69334: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882234.69359: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882234.69415: variable '__network_team_connections_defined' from source: role '' defaults 11728 1726882234.69666: variable 'network_connections' from source: task vars 11728 1726882234.69669: variable 'port2_profile' from source: play vars 11728 1726882234.69729: variable 'port2_profile' from source: play vars 11728 1726882234.69739: variable 'port1_profile' from source: play vars 11728 1726882234.69803: variable 'port1_profile' from source: play vars 11728 1726882234.69811: variable 'controller_profile' from source: play vars 11728 1726882234.69875: variable 'controller_profile' from source: play vars 11728 1726882234.69901: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11728 1726882234.69913: when evaluation is False, skipping this task 11728 1726882234.69915: _execute() done 11728 1726882234.69918: dumping result to json 11728 1726882234.69921: done dumping result, returning 11728 1726882234.69923: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-5c28-a762-000000000e15] 11728 1726882234.69925: sending task result for task 12673a56-9f93-5c28-a762-000000000e15 11728 1726882234.70027: done sending task result for task 12673a56-9f93-5c28-a762-000000000e15 11728 1726882234.70030: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11728 1726882234.70081: no more pending results, returning what we have 11728 1726882234.70086: results queue empty 11728 1726882234.70087: checking for any_errors_fatal 11728 1726882234.70097: done checking for any_errors_fatal 11728 1726882234.70098: checking for max_fail_percentage 11728 1726882234.70101: done checking for max_fail_percentage 11728 1726882234.70102: checking to see if all hosts have failed and the running result is not ok 11728 1726882234.70103: done checking to see if all hosts have failed 11728 1726882234.70103: getting the remaining hosts for this loop 11728 1726882234.70105: done getting the remaining hosts for this loop 11728 1726882234.70109: getting the next task for host managed_node3 11728 1726882234.70118: done getting next task for host managed_node3 11728 1726882234.70124: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11728 1726882234.70129: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882234.70155: getting variables 11728 1726882234.70157: in VariableManager get_vars() 11728 1726882234.70415: Calling all_inventory to load vars for managed_node3 11728 1726882234.70419: Calling groups_inventory to load vars for managed_node3 11728 1726882234.70422: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882234.70432: Calling all_plugins_play to load vars for managed_node3 11728 1726882234.70436: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882234.70439: Calling groups_plugins_play to load vars for managed_node3 11728 1726882234.71892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882234.73552: done with get_vars() 11728 1726882234.73578: done getting variables 11728 1726882234.73640: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:30:34 -0400 (0:00:00.099) 0:00:59.589 ****** 11728 1726882234.73681: entering _queue_task() for managed_node3/service 11728 1726882234.74040: worker is 1 (out of 1 available) 11728 1726882234.74053: exiting _queue_task() for managed_node3/service 11728 1726882234.74065: done queuing things up, now waiting for results queue to drain 11728 1726882234.74067: waiting for pending results... 
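The skip recorded above is the role's conditional guard at work: both __network_wireless_connections_defined and __network_team_connections_defined resolved to False for this set of connection profiles, so "Restart NetworkManager due to wireless or team interfaces" never ran. A minimal sketch of such a guarded task follows; only the task name and the condition string are taken from this log, while the module and its arguments are illustrative and may differ from the actual fedora.linux_system_roles.network source.

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined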
11728 1726882234.74614: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11728 1726882234.74620: in run() - task 12673a56-9f93-5c28-a762-000000000e16 11728 1726882234.74624: variable 'ansible_search_path' from source: unknown 11728 1726882234.74628: variable 'ansible_search_path' from source: unknown 11728 1726882234.74632: calling self._execute() 11728 1726882234.74675: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882234.74679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882234.74688: variable 'omit' from source: magic vars 11728 1726882234.75125: variable 'ansible_distribution_major_version' from source: facts 11728 1726882234.75137: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882234.75320: variable 'network_provider' from source: set_fact 11728 1726882234.75324: variable 'network_state' from source: role '' defaults 11728 1726882234.75335: Evaluated conditional (network_provider == "nm" or network_state != {}): True 11728 1726882234.75341: variable 'omit' from source: magic vars 11728 1726882234.75428: variable 'omit' from source: magic vars 11728 1726882234.75456: variable 'network_service_name' from source: role '' defaults 11728 1726882234.75528: variable 'network_service_name' from source: role '' defaults 11728 1726882234.75642: variable '__network_provider_setup' from source: role '' defaults 11728 1726882234.75648: variable '__network_service_name_default_nm' from source: role '' defaults 11728 1726882234.75711: variable '__network_service_name_default_nm' from source: role '' defaults 11728 1726882234.75730: variable '__network_packages_default_nm' from source: role '' defaults 11728 1726882234.75791: variable '__network_packages_default_nm' from source: role '' defaults 11728 1726882234.76031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882234.78586: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882234.78668: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882234.78706: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882234.78740: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882234.78800: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882234.78849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882234.78884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882234.78912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882234.78977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 11728 1726882234.78981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882234.79024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882234.79086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882234.79089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882234.79113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882234.79127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882234.79360: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11728 1726882234.79519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882234.79523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882234.79540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882234.79578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882234.79592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882234.79690: variable 'ansible_python' from source: facts 11728 1726882234.79704: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11728 1726882234.79845: variable '__network_wpa_supplicant_required' from source: role '' defaults 11728 1726882234.79870: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11728 1726882234.79996: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882234.80019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882234.80043: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882234.80081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882234.80099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882234.80171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882234.80183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882234.80186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882234.80222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882234.80236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882234.80523: variable 'network_connections' from source: task vars 11728 1726882234.80526: variable 'port2_profile' from source: play vars 11728 1726882234.80529: variable 'port2_profile' from source: play vars 11728 1726882234.80531: variable 'port1_profile' from source: play vars 11728 1726882234.80547: variable 'port1_profile' from source: play vars 11728 1726882234.80559: variable 'controller_profile' from source: play vars 11728 1726882234.80637: variable 'controller_profile' from source: play vars 11728 1726882234.80749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882234.80960: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882234.81011: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882234.81051: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882234.81105: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882234.81164: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882234.81203: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882234.81236: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, 
class_only=False) 11728 1726882234.81267: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882234.81327: variable '__network_wireless_connections_defined' from source: role '' defaults 11728 1726882234.81621: variable 'network_connections' from source: task vars 11728 1726882234.81719: variable 'port2_profile' from source: play vars 11728 1726882234.81722: variable 'port2_profile' from source: play vars 11728 1726882234.81725: variable 'port1_profile' from source: play vars 11728 1726882234.81787: variable 'port1_profile' from source: play vars 11728 1726882234.81801: variable 'controller_profile' from source: play vars 11728 1726882234.81876: variable 'controller_profile' from source: play vars 11728 1726882234.81912: variable '__network_packages_default_wireless' from source: role '' defaults 11728 1726882234.81999: variable '__network_wireless_connections_defined' from source: role '' defaults 11728 1726882234.82308: variable 'network_connections' from source: task vars 11728 1726882234.82312: variable 'port2_profile' from source: play vars 11728 1726882234.82386: variable 'port2_profile' from source: play vars 11728 1726882234.82397: variable 'port1_profile' from source: play vars 11728 1726882234.82463: variable 'port1_profile' from source: play vars 11728 1726882234.82475: variable 'controller_profile' from source: play vars 11728 1726882234.82585: variable 'controller_profile' from source: play vars 11728 1726882234.82589: variable '__network_packages_default_team' from source: role '' defaults 11728 1726882234.82661: variable '__network_team_connections_defined' from source: role '' defaults 11728 1726882234.82978: variable 'network_connections' from source: task vars 11728 1726882234.82981: variable 'port2_profile' from source: play vars 11728 1726882234.83062: variable 'port2_profile' from source: play vars 11728 1726882234.83070: variable 'port1_profile' from source: play vars 11728 1726882234.83200: variable 'port1_profile' from source: play vars 11728 1726882234.83204: variable 'controller_profile' from source: play vars 11728 1726882234.83221: variable 'controller_profile' from source: play vars 11728 1726882234.83284: variable '__network_service_name_default_initscripts' from source: role '' defaults 11728 1726882234.83350: variable '__network_service_name_default_initscripts' from source: role '' defaults 11728 1726882234.83362: variable '__network_packages_default_initscripts' from source: role '' defaults 11728 1726882234.83423: variable '__network_packages_default_initscripts' from source: role '' defaults 11728 1726882234.83660: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11728 1726882234.84214: variable 'network_connections' from source: task vars 11728 1726882234.84409: variable 'port2_profile' from source: play vars 11728 1726882234.84412: variable 'port2_profile' from source: play vars 11728 1726882234.84415: variable 'port1_profile' from source: play vars 11728 1726882234.84417: variable 'port1_profile' from source: play vars 11728 1726882234.84419: variable 'controller_profile' from source: play vars 11728 1726882234.84422: variable 'controller_profile' from source: play vars 11728 1726882234.84427: variable 'ansible_distribution' from source: facts 11728 1726882234.84432: variable '__network_rh_distros' from source: role '' defaults 11728 1726882234.84443: variable 
'ansible_distribution_major_version' from source: facts 11728 1726882234.84458: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11728 1726882234.84640: variable 'ansible_distribution' from source: facts 11728 1726882234.84648: variable '__network_rh_distros' from source: role '' defaults 11728 1726882234.84657: variable 'ansible_distribution_major_version' from source: facts 11728 1726882234.84671: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11728 1726882234.84849: variable 'ansible_distribution' from source: facts 11728 1726882234.84852: variable '__network_rh_distros' from source: role '' defaults 11728 1726882234.84863: variable 'ansible_distribution_major_version' from source: facts 11728 1726882234.84904: variable 'network_provider' from source: set_fact 11728 1726882234.84928: variable 'omit' from source: magic vars 11728 1726882234.84956: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882234.84997: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882234.85013: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882234.85031: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882234.85042: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882234.85070: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882234.85075: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882234.85190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882234.85203: Set connection var ansible_connection to ssh 11728 1726882234.85206: Set connection var ansible_shell_executable to /bin/sh 11728 1726882234.85209: Set connection var ansible_timeout to 10 11728 1726882234.85211: Set connection var ansible_shell_type to sh 11728 1726882234.85218: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882234.85223: Set connection var ansible_pipelining to False 11728 1726882234.85248: variable 'ansible_shell_executable' from source: unknown 11728 1726882234.85251: variable 'ansible_connection' from source: unknown 11728 1726882234.85254: variable 'ansible_module_compression' from source: unknown 11728 1726882234.85256: variable 'ansible_shell_type' from source: unknown 11728 1726882234.85259: variable 'ansible_shell_executable' from source: unknown 11728 1726882234.85261: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882234.85265: variable 'ansible_pipelining' from source: unknown 11728 1726882234.85267: variable 'ansible_timeout' from source: unknown 11728 1726882234.85271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882234.85388: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882234.85402: variable 'omit' from source: magic vars 11728 1726882234.85409: starting attempt loop 11728 1726882234.85412: running the 
handler 11728 1726882234.85487: variable 'ansible_facts' from source: unknown 11728 1726882234.86251: _low_level_execute_command(): starting 11728 1726882234.86257: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882234.87085: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882234.87176: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882234.87183: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882234.87186: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882234.87272: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882234.88965: stdout chunk (state=3): >>>/root <<< 11728 1726882234.89101: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882234.89131: stdout chunk (state=3): >>><<< 11728 1726882234.89134: stderr chunk (state=3): >>><<< 11728 1726882234.89261: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882234.89266: _low_level_execute_command(): starting 11728 1726882234.89268: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882234.8916-14621-128355273977224 `" && echo ansible-tmp-1726882234.8916-14621-128355273977224="` echo 
/root/.ansible/tmp/ansible-tmp-1726882234.8916-14621-128355273977224 `" ) && sleep 0' 11728 1726882234.89817: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882234.89861: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882234.89900: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882234.89904: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882234.89969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882234.91845: stdout chunk (state=3): >>>ansible-tmp-1726882234.8916-14621-128355273977224=/root/.ansible/tmp/ansible-tmp-1726882234.8916-14621-128355273977224 <<< 11728 1726882234.92080: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882234.92083: stdout chunk (state=3): >>><<< 11728 1726882234.92085: stderr chunk (state=3): >>><<< 11728 1726882234.92104: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882234.8916-14621-128355273977224=/root/.ansible/tmp/ansible-tmp-1726882234.8916-14621-128355273977224 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882234.92142: variable 'ansible_module_compression' from source: unknown 11728 1726882234.92292: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 11728 1726882234.92299: variable 'ansible_facts' from source: unknown 
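By this point the generic service action has resolved to the systemd module for this host, the cached AnsiballZ payload for ansible.modules.systemd is being reused, and a remote temp directory has been created to receive it. The module arguments echoed later in the result (name=NetworkManager, state=started, enabled=true) correspond to a task roughly like the sketch below; it is reconstructed from the logged module_args and the network_service_name role default resolved above, not copied from the role's source.

- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: "{{ network_service_name }}"
    state: started
    enabled: true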
11728 1726882234.92496: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882234.8916-14621-128355273977224/AnsiballZ_systemd.py 11728 1726882234.92608: Sending initial data 11728 1726882234.92612: Sent initial data (153 bytes) 11728 1726882234.93036: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882234.93040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882234.93043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882234.93045: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882234.93047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882234.93086: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882234.93110: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882234.93151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882234.94676: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882234.94711: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882234.94772: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpe7zvp4gh /root/.ansible/tmp/ansible-tmp-1726882234.8916-14621-128355273977224/AnsiballZ_systemd.py <<< 11728 1726882234.94777: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882234.8916-14621-128355273977224/AnsiballZ_systemd.py" <<< 11728 1726882234.94841: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpe7zvp4gh" to remote "/root/.ansible/tmp/ansible-tmp-1726882234.8916-14621-128355273977224/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882234.8916-14621-128355273977224/AnsiballZ_systemd.py" <<< 11728 1726882234.95900: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882234.95981: stderr chunk (state=3): >>><<< 11728 1726882234.95985: stdout chunk (state=3): >>><<< 11728 1726882234.95987: done transferring module to remote 11728 1726882234.95989: _low_level_execute_command(): starting 11728 1726882234.95991: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882234.8916-14621-128355273977224/ /root/.ansible/tmp/ansible-tmp-1726882234.8916-14621-128355273977224/AnsiballZ_systemd.py && sleep 0' 11728 1726882234.96365: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882234.96368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882234.96411: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882234.96450: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882234.96453: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882234.96508: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882234.98427: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882234.98444: stderr chunk (state=3): >>><<< 11728 1726882234.98447: stdout chunk (state=3): >>><<< 11728 1726882234.98458: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882234.98461: _low_level_execute_command(): starting 11728 1726882234.98466: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882234.8916-14621-128355273977224/AnsiballZ_systemd.py && sleep 0' 11728 1726882234.98860: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882234.98864: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882234.98889: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882234.98892: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882234.98936: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882234.98939: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882234.98952: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882234.99023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882235.27857: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "711", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": 
"success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainStartTimestampMonotonic": "33869352", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainHandoffTimestampMonotonic": "33887880", "ExecMainPID": "711", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10539008", "MemoryPeak": "13234176", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3298463744", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1025072000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpR<<< 11728 1726882235.27865: stdout chunk (state=3): >>>eceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": 
"infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", 
"BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target NetworkManager-wait-online.service shutdown.target cloud-init.service multi-user.target", "After": "system.slice network-pre.target dbus.socket systemd-journald.socket sysinit.target cloud-init-local.service dbus-broker.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:09 EDT", "StateChangeTimestampMonotonic": "456380513", "InactiveExitTimestamp": "Fri 2024-09-20 21:22:06 EDT", "InactiveExitTimestampMonotonic": "33869684", "ActiveEnterTimestamp": "Fri 2024-09-20 21:22:07 EDT", "ActiveEnterTimestampMonotonic": "34618487", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ConditionTimestampMonotonic": "33868497", "AssertTimestamp": "Fri 2024-09-20 21:22:06 EDT", "AssertTimestampMonotonic": "33868500", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4af90bef47894571b134cea51cfa03e2", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11728 1726882235.29603: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 11728 1726882235.29632: stderr chunk (state=3): >>><<< 11728 1726882235.29635: stdout chunk (state=3): >>><<< 11728 1726882235.29651: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "711", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainStartTimestampMonotonic": "33869352", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ExecMainHandoffTimestampMonotonic": "33887880", "ExecMainPID": "711", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10539008", "MemoryPeak": "13234176", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3298463744", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1025072000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target NetworkManager-wait-online.service shutdown.target cloud-init.service multi-user.target", "After": "system.slice network-pre.target dbus.socket systemd-journald.socket sysinit.target cloud-init-local.service dbus-broker.service basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:09 EDT", "StateChangeTimestampMonotonic": "456380513", "InactiveExitTimestamp": "Fri 2024-09-20 21:22:06 EDT", "InactiveExitTimestampMonotonic": "33869684", "ActiveEnterTimestamp": "Fri 2024-09-20 21:22:07 EDT", "ActiveEnterTimestampMonotonic": "34618487", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:22:06 EDT", "ConditionTimestampMonotonic": "33868497", "AssertTimestamp": "Fri 2024-09-20 21:22:06 EDT", "AssertTimestampMonotonic": "33868500", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "4af90bef47894571b134cea51cfa03e2", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 11728 1726882235.29773: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882234.8916-14621-128355273977224/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882235.29788: _low_level_execute_command(): starting 11728 1726882235.29795: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882234.8916-14621-128355273977224/ > /dev/null 2>&1 && sleep 0' 11728 1726882235.30262: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882235.30265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882235.30267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882235.30270: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882235.30272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882235.30332: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882235.30339: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 11728 1726882235.30341: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882235.30387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882235.32152: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882235.32181: stderr chunk (state=3): >>><<< 11728 1726882235.32184: stdout chunk (state=3): >>><<< 11728 1726882235.32201: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882235.32208: handler run complete 11728 1726882235.32250: attempt loop complete, returning result 11728 1726882235.32253: _execute() done 11728 1726882235.32256: dumping result to json 11728 1726882235.32270: done dumping result, returning 11728 1726882235.32279: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-5c28-a762-000000000e16] 11728 1726882235.32283: sending task result for task 12673a56-9f93-5c28-a762-000000000e16 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11728 1726882235.32791: no more pending results, returning what we have 11728 1726882235.32797: results queue empty 11728 1726882235.32798: checking for any_errors_fatal 11728 1726882235.32801: done checking for any_errors_fatal 11728 1726882235.32802: checking for max_fail_percentage 11728 1726882235.32803: done checking for max_fail_percentage 11728 1726882235.32804: checking to see if all hosts have failed and the running result is not ok 11728 1726882235.32804: done checking to see if all hosts have failed 11728 1726882235.32805: getting the remaining hosts for this loop 11728 1726882235.32806: done getting the remaining hosts for this loop 11728 1726882235.32808: getting the next task for host managed_node3 11728 1726882235.32817: done getting next task for host managed_node3 11728 1726882235.32820: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11728 1726882235.32826: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882235.32835: done sending task result for task 12673a56-9f93-5c28-a762-000000000e16 11728 1726882235.32839: WORKER PROCESS EXITING 11728 1726882235.32845: getting variables 11728 1726882235.32846: in VariableManager get_vars() 11728 1726882235.32872: Calling all_inventory to load vars for managed_node3 11728 1726882235.32874: Calling groups_inventory to load vars for managed_node3 11728 1726882235.32875: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882235.32882: Calling all_plugins_play to load vars for managed_node3 11728 1726882235.32883: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882235.32885: Calling groups_plugins_play to load vars for managed_node3 11728 1726882235.33545: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882235.34410: done with get_vars() 11728 1726882235.34428: done getting variables 11728 1726882235.34472: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:30:35 -0400 (0:00:00.608) 0:01:00.197 ****** 11728 1726882235.34507: entering _queue_task() for managed_node3/service 11728 1726882235.34761: worker is 1 (out of 1 available) 11728 1726882235.34775: exiting _queue_task() for managed_node3/service 11728 1726882235.34787: done queuing things up, now waiting for results queue to drain 11728 1726882235.34789: waiting for pending results... 
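The module invocation logged above for the "Enable and start NetworkManager" task (ansible.legacy.systemd with name=NetworkManager, state=started, enabled=True, and a result censored by no_log) corresponds to a task shaped roughly like the sketch below. The YAML is illustrative only, reconstructed from the logged module arguments rather than copied from the role source at roles/network/tasks/main.yml.

# Illustrative sketch, not the role's verbatim task definition.
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started     # matches 'state': 'started' in the logged module args
    enabled: true      # matches 'enabled': True
  no_log: true         # why the result above reads "censored"
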
11728 1726882235.34984: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11728 1726882235.35096: in run() - task 12673a56-9f93-5c28-a762-000000000e17 11728 1726882235.35110: variable 'ansible_search_path' from source: unknown 11728 1726882235.35115: variable 'ansible_search_path' from source: unknown 11728 1726882235.35146: calling self._execute() 11728 1726882235.35225: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882235.35230: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882235.35243: variable 'omit' from source: magic vars 11728 1726882235.35531: variable 'ansible_distribution_major_version' from source: facts 11728 1726882235.35540: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882235.35626: variable 'network_provider' from source: set_fact 11728 1726882235.35629: Evaluated conditional (network_provider == "nm"): True 11728 1726882235.35692: variable '__network_wpa_supplicant_required' from source: role '' defaults 11728 1726882235.35756: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11728 1726882235.35875: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882235.37348: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882235.37389: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882235.37422: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882235.37447: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882235.37467: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882235.37646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882235.37665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882235.37682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882235.37713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882235.37725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882235.37760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882235.37777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 11728 1726882235.37794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882235.37822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882235.37832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882235.37863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882235.37878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882235.37896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882235.37922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882235.37932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882235.38029: variable 'network_connections' from source: task vars 11728 1726882235.38039: variable 'port2_profile' from source: play vars 11728 1726882235.38086: variable 'port2_profile' from source: play vars 11728 1726882235.38096: variable 'port1_profile' from source: play vars 11728 1726882235.38140: variable 'port1_profile' from source: play vars 11728 1726882235.38147: variable 'controller_profile' from source: play vars 11728 1726882235.38191: variable 'controller_profile' from source: play vars 11728 1726882235.38241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11728 1726882235.38351: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11728 1726882235.38377: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11728 1726882235.38406: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11728 1726882235.38429: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11728 1726882235.38459: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11728 1726882235.38474: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11728 1726882235.38490: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882235.38516: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11728 1726882235.38554: variable '__network_wireless_connections_defined' from source: role '' defaults 11728 1726882235.38720: variable 'network_connections' from source: task vars 11728 1726882235.38723: variable 'port2_profile' from source: play vars 11728 1726882235.38763: variable 'port2_profile' from source: play vars 11728 1726882235.38769: variable 'port1_profile' from source: play vars 11728 1726882235.38814: variable 'port1_profile' from source: play vars 11728 1726882235.38828: variable 'controller_profile' from source: play vars 11728 1726882235.38866: variable 'controller_profile' from source: play vars 11728 1726882235.38888: Evaluated conditional (__network_wpa_supplicant_required): False 11728 1726882235.38891: when evaluation is False, skipping this task 11728 1726882235.38896: _execute() done 11728 1726882235.38900: dumping result to json 11728 1726882235.38903: done dumping result, returning 11728 1726882235.38911: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-5c28-a762-000000000e17] 11728 1726882235.38916: sending task result for task 12673a56-9f93-5c28-a762-000000000e17 11728 1726882235.39002: done sending task result for task 12673a56-9f93-5c28-a762-000000000e17 11728 1726882235.39005: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 11728 1726882235.39052: no more pending results, returning what we have 11728 1726882235.39056: results queue empty 11728 1726882235.39057: checking for any_errors_fatal 11728 1726882235.39075: done checking for any_errors_fatal 11728 1726882235.39076: checking for max_fail_percentage 11728 1726882235.39078: done checking for max_fail_percentage 11728 1726882235.39078: checking to see if all hosts have failed and the running result is not ok 11728 1726882235.39079: done checking to see if all hosts have failed 11728 1726882235.39080: getting the remaining hosts for this loop 11728 1726882235.39081: done getting the remaining hosts for this loop 11728 1726882235.39085: getting the next task for host managed_node3 11728 1726882235.39092: done getting next task for host managed_node3 11728 1726882235.39098: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 11728 1726882235.39103: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882235.39127: getting variables 11728 1726882235.39129: in VariableManager get_vars() 11728 1726882235.39173: Calling all_inventory to load vars for managed_node3 11728 1726882235.39176: Calling groups_inventory to load vars for managed_node3 11728 1726882235.39178: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882235.39187: Calling all_plugins_play to load vars for managed_node3 11728 1726882235.39189: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882235.39191: Calling groups_plugins_play to load vars for managed_node3 11728 1726882235.40070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882235.40945: done with get_vars() 11728 1726882235.40960: done getting variables 11728 1726882235.41006: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:30:35 -0400 (0:00:00.065) 0:01:00.262 ****** 11728 1726882235.41029: entering _queue_task() for managed_node3/service 11728 1726882235.41269: worker is 1 (out of 1 available) 11728 1726882235.41284: exiting _queue_task() for managed_node3/service 11728 1726882235.41301: done queuing things up, now waiting for results queue to drain 11728 1726882235.41303: waiting for pending results... 
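The "Enable and start wpa_supplicant" task above was skipped because the role default __network_wpa_supplicant_required (derived from whether any wireless or IEEE 802.1X connections are defined) evaluated to False. The pattern is a service task guarded by a when: condition, sketched below; the module argument values are assumptions, since the task never ran in this log.

# Illustrative sketch of the when-guarded service task; argument values are
# assumed, as the task was skipped before any module args were logged.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when: __network_wpa_supplicant_required
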
11728 1726882235.41487: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 11728 1726882235.41588: in run() - task 12673a56-9f93-5c28-a762-000000000e18 11728 1726882235.41602: variable 'ansible_search_path' from source: unknown 11728 1726882235.41606: variable 'ansible_search_path' from source: unknown 11728 1726882235.41642: calling self._execute() 11728 1726882235.41711: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882235.41714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882235.41723: variable 'omit' from source: magic vars 11728 1726882235.42001: variable 'ansible_distribution_major_version' from source: facts 11728 1726882235.42011: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882235.42090: variable 'network_provider' from source: set_fact 11728 1726882235.42098: Evaluated conditional (network_provider == "initscripts"): False 11728 1726882235.42101: when evaluation is False, skipping this task 11728 1726882235.42105: _execute() done 11728 1726882235.42107: dumping result to json 11728 1726882235.42110: done dumping result, returning 11728 1726882235.42113: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-5c28-a762-000000000e18] 11728 1726882235.42119: sending task result for task 12673a56-9f93-5c28-a762-000000000e18 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11728 1726882235.42251: no more pending results, returning what we have 11728 1726882235.42256: results queue empty 11728 1726882235.42257: checking for any_errors_fatal 11728 1726882235.42265: done checking for any_errors_fatal 11728 1726882235.42266: checking for max_fail_percentage 11728 1726882235.42268: done checking for max_fail_percentage 11728 1726882235.42268: checking to see if all hosts have failed and the running result is not ok 11728 1726882235.42269: done checking to see if all hosts have failed 11728 1726882235.42270: getting the remaining hosts for this loop 11728 1726882235.42271: done getting the remaining hosts for this loop 11728 1726882235.42274: getting the next task for host managed_node3 11728 1726882235.42281: done getting next task for host managed_node3 11728 1726882235.42285: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11728 1726882235.42290: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882235.42315: getting variables 11728 1726882235.42317: in VariableManager get_vars() 11728 1726882235.42353: Calling all_inventory to load vars for managed_node3 11728 1726882235.42356: Calling groups_inventory to load vars for managed_node3 11728 1726882235.42358: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882235.42365: Calling all_plugins_play to load vars for managed_node3 11728 1726882235.42368: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882235.42370: Calling groups_plugins_play to load vars for managed_node3 11728 1726882235.42908: done sending task result for task 12673a56-9f93-5c28-a762-000000000e18 11728 1726882235.42911: WORKER PROCESS EXITING 11728 1726882235.43132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882235.44004: done with get_vars() 11728 1726882235.44018: done getting variables 11728 1726882235.44060: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:30:35 -0400 (0:00:00.030) 0:01:00.293 ****** 11728 1726882235.44084: entering _queue_task() for managed_node3/copy 11728 1726882235.44301: worker is 1 (out of 1 available) 11728 1726882235.44315: exiting _queue_task() for managed_node3/copy 11728 1726882235.44327: done queuing things up, now waiting for results queue to drain 11728 1726882235.44329: waiting for pending results... 
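The "Enable network service" skip above and the "Ensure initscripts network file dependency is present" task just queued both hinge on network_provider, which set_fact resolved to "nm" earlier in the run, so the initscripts-only branches are bypassed. A play can also pin this choice explicitly when invoking the role; the sketch below is illustrative, with the host name and provider value taken from this log.

# Illustrative only: selecting the provider that drives the
# network_provider == "initscripts" / == "nm" conditionals seen above.
- hosts: managed_node3
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_provider: nm
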
11728 1726882235.44510: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11728 1726882235.44608: in run() - task 12673a56-9f93-5c28-a762-000000000e19 11728 1726882235.44618: variable 'ansible_search_path' from source: unknown 11728 1726882235.44621: variable 'ansible_search_path' from source: unknown 11728 1726882235.44648: calling self._execute() 11728 1726882235.44721: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882235.44725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882235.44733: variable 'omit' from source: magic vars 11728 1726882235.45003: variable 'ansible_distribution_major_version' from source: facts 11728 1726882235.45008: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882235.45082: variable 'network_provider' from source: set_fact 11728 1726882235.45087: Evaluated conditional (network_provider == "initscripts"): False 11728 1726882235.45090: when evaluation is False, skipping this task 11728 1726882235.45098: _execute() done 11728 1726882235.45101: dumping result to json 11728 1726882235.45103: done dumping result, returning 11728 1726882235.45108: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-5c28-a762-000000000e19] 11728 1726882235.45111: sending task result for task 12673a56-9f93-5c28-a762-000000000e19 11728 1726882235.45202: done sending task result for task 12673a56-9f93-5c28-a762-000000000e19 11728 1726882235.45205: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11728 1726882235.45263: no more pending results, returning what we have 11728 1726882235.45266: results queue empty 11728 1726882235.45267: checking for any_errors_fatal 11728 1726882235.45272: done checking for any_errors_fatal 11728 1726882235.45273: checking for max_fail_percentage 11728 1726882235.45274: done checking for max_fail_percentage 11728 1726882235.45275: checking to see if all hosts have failed and the running result is not ok 11728 1726882235.45275: done checking to see if all hosts have failed 11728 1726882235.45276: getting the remaining hosts for this loop 11728 1726882235.45277: done getting the remaining hosts for this loop 11728 1726882235.45280: getting the next task for host managed_node3 11728 1726882235.45286: done getting next task for host managed_node3 11728 1726882235.45289: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11728 1726882235.45298: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882235.45316: getting variables 11728 1726882235.45317: in VariableManager get_vars() 11728 1726882235.45354: Calling all_inventory to load vars for managed_node3 11728 1726882235.45357: Calling groups_inventory to load vars for managed_node3 11728 1726882235.45359: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882235.45366: Calling all_plugins_play to load vars for managed_node3 11728 1726882235.45369: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882235.45371: Calling groups_plugins_play to load vars for managed_node3 11728 1726882235.46200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882235.47099: done with get_vars() 11728 1726882235.47116: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:30:35 -0400 (0:00:00.030) 0:01:00.324 ****** 11728 1726882235.47172: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 11728 1726882235.47386: worker is 1 (out of 1 available) 11728 1726882235.47403: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 11728 1726882235.47416: done queuing things up, now waiting for results queue to drain 11728 1726882235.47417: waiting for pending results... 
11728 1726882235.47588: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11728 1726882235.47685: in run() - task 12673a56-9f93-5c28-a762-000000000e1a 11728 1726882235.47701: variable 'ansible_search_path' from source: unknown 11728 1726882235.47705: variable 'ansible_search_path' from source: unknown 11728 1726882235.47731: calling self._execute() 11728 1726882235.47803: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882235.47807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882235.47816: variable 'omit' from source: magic vars 11728 1726882235.48079: variable 'ansible_distribution_major_version' from source: facts 11728 1726882235.48089: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882235.48097: variable 'omit' from source: magic vars 11728 1726882235.48140: variable 'omit' from source: magic vars 11728 1726882235.48251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11728 1726882235.50036: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11728 1726882235.50077: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11728 1726882235.50108: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11728 1726882235.50136: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11728 1726882235.50155: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11728 1726882235.50214: variable 'network_provider' from source: set_fact 11728 1726882235.50306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11728 1726882235.50327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11728 1726882235.50347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11728 1726882235.50372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11728 1726882235.50383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11728 1726882235.50439: variable 'omit' from source: magic vars 11728 1726882235.50511: variable 'omit' from source: magic vars 11728 1726882235.50581: variable 'network_connections' from source: task vars 11728 1726882235.50592: variable 'port2_profile' from source: play vars 11728 1726882235.50637: variable 'port2_profile' from source: play vars 11728 1726882235.50645: variable 'port1_profile' from source: play vars 11728 1726882235.50687: variable 'port1_profile' from source: play vars 11728 1726882235.50698: variable 'controller_profile' from source: 
play vars 11728 1726882235.50737: variable 'controller_profile' from source: play vars 11728 1726882235.50848: variable 'omit' from source: magic vars 11728 1726882235.50855: variable '__lsr_ansible_managed' from source: task vars 11728 1726882235.50902: variable '__lsr_ansible_managed' from source: task vars 11728 1726882235.51030: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 11728 1726882235.51167: Loaded config def from plugin (lookup/template) 11728 1726882235.51171: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 11728 1726882235.51191: File lookup term: get_ansible_managed.j2 11728 1726882235.51199: variable 'ansible_search_path' from source: unknown 11728 1726882235.51204: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 11728 1726882235.51214: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 11728 1726882235.51227: variable 'ansible_search_path' from source: unknown 11728 1726882235.60101: variable 'ansible_managed' from source: unknown 11728 1726882235.60499: variable 'omit' from source: magic vars 11728 1726882235.60503: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882235.60506: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882235.60508: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882235.60510: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882235.60512: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882235.60514: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882235.60516: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882235.60518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882235.60898: Set connection var ansible_connection to ssh 11728 1726882235.60902: Set connection var ansible_shell_executable to /bin/sh 11728 1726882235.60904: Set connection var ansible_timeout to 10 11728 1726882235.60906: Set connection var ansible_shell_type to sh 11728 1726882235.60908: Set connection var ansible_module_compression to 
ZIP_DEFLATED 11728 1726882235.60910: Set connection var ansible_pipelining to False 11728 1726882235.60912: variable 'ansible_shell_executable' from source: unknown 11728 1726882235.60914: variable 'ansible_connection' from source: unknown 11728 1726882235.60916: variable 'ansible_module_compression' from source: unknown 11728 1726882235.60918: variable 'ansible_shell_type' from source: unknown 11728 1726882235.60920: variable 'ansible_shell_executable' from source: unknown 11728 1726882235.60922: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882235.60924: variable 'ansible_pipelining' from source: unknown 11728 1726882235.60925: variable 'ansible_timeout' from source: unknown 11728 1726882235.60934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882235.61152: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11728 1726882235.61168: variable 'omit' from source: magic vars 11728 1726882235.61178: starting attempt loop 11728 1726882235.61185: running the handler 11728 1726882235.61204: _low_level_execute_command(): starting 11728 1726882235.61216: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882235.62208: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882235.62225: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882235.62242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882235.62259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882235.62275: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882235.62285: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882235.62303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882235.62323: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882235.62411: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882235.62435: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882235.62523: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882235.64197: stdout chunk (state=3): >>>/root <<< 11728 1726882235.64441: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882235.64472: stderr chunk (state=3): >>><<< 11728 1726882235.64481: stdout chunk (state=3): >>><<< 11728 1726882235.64520: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882235.64538: _low_level_execute_command(): starting 11728 1726882235.64548: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882235.6452715-14639-208531099424512 `" && echo ansible-tmp-1726882235.6452715-14639-208531099424512="` echo /root/.ansible/tmp/ansible-tmp-1726882235.6452715-14639-208531099424512 `" ) && sleep 0' 11728 1726882235.65205: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882235.65218: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882235.65283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882235.65342: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882235.65364: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882235.65388: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882235.65466: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882235.67473: stdout chunk (state=3): >>>ansible-tmp-1726882235.6452715-14639-208531099424512=/root/.ansible/tmp/ansible-tmp-1726882235.6452715-14639-208531099424512 <<< 11728 1726882235.67615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882235.67639: stderr chunk (state=3): >>><<< 11728 1726882235.67652: stdout chunk (state=3): >>><<< 11728 1726882235.67687: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882235.6452715-14639-208531099424512=/root/.ansible/tmp/ansible-tmp-1726882235.6452715-14639-208531099424512 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882235.67746: variable 'ansible_module_compression' from source: unknown 11728 1726882235.67903: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 11728 1726882235.67906: variable 'ansible_facts' from source: unknown 11728 1726882235.67998: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882235.6452715-14639-208531099424512/AnsiballZ_network_connections.py 11728 1726882235.68140: Sending initial data 11728 1726882235.68246: Sent initial data (168 bytes) 11728 1726882235.68790: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882235.68811: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882235.68914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882235.68931: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882235.68953: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882235.68968: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882235.69042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882235.70561: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports 
extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882235.70621: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882235.70688: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmptj5bc8qk /root/.ansible/tmp/ansible-tmp-1726882235.6452715-14639-208531099424512/AnsiballZ_network_connections.py <<< 11728 1726882235.70705: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882235.6452715-14639-208531099424512/AnsiballZ_network_connections.py" <<< 11728 1726882235.70736: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmptj5bc8qk" to remote "/root/.ansible/tmp/ansible-tmp-1726882235.6452715-14639-208531099424512/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882235.6452715-14639-208531099424512/AnsiballZ_network_connections.py" <<< 11728 1726882235.72056: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882235.72059: stdout chunk (state=3): >>><<< 11728 1726882235.72062: stderr chunk (state=3): >>><<< 11728 1726882235.72077: done transferring module to remote 11728 1726882235.72106: _low_level_execute_command(): starting 11728 1726882235.72165: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882235.6452715-14639-208531099424512/ /root/.ansible/tmp/ansible-tmp-1726882235.6452715-14639-208531099424512/AnsiballZ_network_connections.py && sleep 0' 11728 1726882235.73078: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882235.73100: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882235.73112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882235.73127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882235.73140: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882235.73147: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882235.73156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882235.73176: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882235.73180: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882235.73200: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11728 1726882235.73203: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882235.73205: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882235.73215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882235.73288: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882235.73299: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882235.73315: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882235.73383: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882235.75186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882235.75203: stderr chunk (state=3): >>><<< 11728 1726882235.75212: stdout chunk (state=3): >>><<< 11728 1726882235.75301: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882235.75305: _low_level_execute_command(): starting 11728 1726882235.75307: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882235.6452715-14639-208531099424512/AnsiballZ_network_connections.py && sleep 0' 11728 1726882235.75945: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882235.75968: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882235.75982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882235.76012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882235.76085: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882235.76140: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882235.76162: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882235.76200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882235.76314: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882236.27048: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_6t1034m7/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_6t1034m7/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail <<< 11728 1726882236.27082: stdout chunk (state=3): >>>ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/8458f044-ac0a-4f70-866e-afbf375585aa: error=unknown <<< 11728 1726882236.28779: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_6t1034m7/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back <<< 11728 1726882236.28783: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_6t1034m7/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/5920b9c7-d6f0-4518-a9a4-f38b43b06206: error=unknown <<< 11728 1726882236.30503: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_6t1034m7/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_6t1034m7/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail <<< 11728 1726882236.30508: stdout chunk (state=3): >>>ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/2dd6ee50-995a-4f49-bd7b-b3c1e472ace8: error=unknown <<< 11728 1726882236.30719: stdout chunk (state=3): >>> <<< 11728 1726882236.30728: stdout chunk (state=3): >>>{"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": 
"bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 11728 1726882236.32575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882236.32608: stderr chunk (state=3): >>><<< 11728 1726882236.32611: stdout chunk (state=3): >>><<< 11728 1726882236.32628: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_6t1034m7/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_6t1034m7/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/8458f044-ac0a-4f70-866e-afbf375585aa: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_6t1034m7/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_6t1034m7/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/5920b9c7-d6f0-4518-a9a4-f38b43b06206: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_6t1034m7/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_6t1034m7/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/2dd6ee50-995a-4f49-bd7b-b3c1e472ace8: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", 
"connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
11728 1726882236.32666: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882235.6452715-14639-208531099424512/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882236.32675: _low_level_execute_command(): starting 11728 1726882236.32678: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882235.6452715-14639-208531099424512/ > /dev/null 2>&1 && sleep 0' 11728 1726882236.33135: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882236.33139: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882236.33141: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882236.33143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882236.33198: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882236.33204: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882236.33208: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882236.33249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882236.35065: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882236.35097: stderr chunk (state=3): >>><<< 11728 1726882236.35101: stdout chunk (state=3): >>><<< 11728 1726882236.35115: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882236.35120: handler run complete 11728 1726882236.35144: attempt loop complete, returning result 11728 1726882236.35147: _execute() done 11728 1726882236.35150: dumping result to json 11728 1726882236.35155: done dumping result, returning 11728 1726882236.35163: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-5c28-a762-000000000e1a] 11728 1726882236.35168: sending task result for task 12673a56-9f93-5c28-a762-000000000e1a 11728 1726882236.35275: done sending task result for task 12673a56-9f93-5c28-a762-000000000e1a 11728 1726882236.35278: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 11728 1726882236.35387: no more pending results, returning what we have 11728 1726882236.35390: results queue empty 11728 1726882236.35391: checking for any_errors_fatal 11728 1726882236.35407: done checking for any_errors_fatal 11728 1726882236.35408: checking for max_fail_percentage 11728 1726882236.35410: done checking for max_fail_percentage 11728 1726882236.35411: checking to see if all hosts have failed and the running result is not ok 11728 1726882236.35412: done checking to see if all hosts have failed 11728 1726882236.35412: getting the remaining hosts for this loop 11728 1726882236.35414: done getting the remaining hosts for this loop 11728 1726882236.35417: getting the next task for host managed_node3 11728 1726882236.35424: done getting next task for host managed_node3 11728 1726882236.35428: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 11728 1726882236.35433: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882236.35445: getting variables 11728 1726882236.35447: in VariableManager get_vars() 11728 1726882236.35488: Calling all_inventory to load vars for managed_node3 11728 1726882236.35500: Calling groups_inventory to load vars for managed_node3 11728 1726882236.35503: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882236.35518: Calling all_plugins_play to load vars for managed_node3 11728 1726882236.35521: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882236.35524: Calling groups_plugins_play to load vars for managed_node3 11728 1726882236.36331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882236.37330: done with get_vars() 11728 1726882236.37348: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:30:36 -0400 (0:00:00.902) 0:01:01.226 ****** 11728 1726882236.37416: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 11728 1726882236.37672: worker is 1 (out of 1 available) 11728 1726882236.37687: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 11728 1726882236.37705: done queuing things up, now waiting for results queue to drain 11728 1726882236.37707: waiting for pending results... 
11728 1726882236.37887: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 11728 1726882236.38000: in run() - task 12673a56-9f93-5c28-a762-000000000e1b 11728 1726882236.38011: variable 'ansible_search_path' from source: unknown 11728 1726882236.38015: variable 'ansible_search_path' from source: unknown 11728 1726882236.38046: calling self._execute() 11728 1726882236.38118: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882236.38122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882236.38131: variable 'omit' from source: magic vars 11728 1726882236.38408: variable 'ansible_distribution_major_version' from source: facts 11728 1726882236.38419: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882236.38504: variable 'network_state' from source: role '' defaults 11728 1726882236.38513: Evaluated conditional (network_state != {}): False 11728 1726882236.38516: when evaluation is False, skipping this task 11728 1726882236.38519: _execute() done 11728 1726882236.38521: dumping result to json 11728 1726882236.38524: done dumping result, returning 11728 1726882236.38531: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-5c28-a762-000000000e1b] 11728 1726882236.38536: sending task result for task 12673a56-9f93-5c28-a762-000000000e1b 11728 1726882236.38623: done sending task result for task 12673a56-9f93-5c28-a762-000000000e1b 11728 1726882236.38626: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11728 1726882236.38674: no more pending results, returning what we have 11728 1726882236.38678: results queue empty 11728 1726882236.38679: checking for any_errors_fatal 11728 1726882236.38688: done checking for any_errors_fatal 11728 1726882236.38688: checking for max_fail_percentage 11728 1726882236.38690: done checking for max_fail_percentage 11728 1726882236.38692: checking to see if all hosts have failed and the running result is not ok 11728 1726882236.38692: done checking to see if all hosts have failed 11728 1726882236.38697: getting the remaining hosts for this loop 11728 1726882236.38699: done getting the remaining hosts for this loop 11728 1726882236.38702: getting the next task for host managed_node3 11728 1726882236.38709: done getting next task for host managed_node3 11728 1726882236.38713: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11728 1726882236.38717: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882236.38744: getting variables 11728 1726882236.38746: in VariableManager get_vars() 11728 1726882236.38783: Calling all_inventory to load vars for managed_node3 11728 1726882236.38785: Calling groups_inventory to load vars for managed_node3 11728 1726882236.38787: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882236.38799: Calling all_plugins_play to load vars for managed_node3 11728 1726882236.38802: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882236.38804: Calling groups_plugins_play to load vars for managed_node3 11728 1726882236.39557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882236.40432: done with get_vars() 11728 1726882236.40447: done getting variables 11728 1726882236.40489: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:30:36 -0400 (0:00:00.030) 0:01:01.257 ****** 11728 1726882236.40518: entering _queue_task() for managed_node3/debug 11728 1726882236.40736: worker is 1 (out of 1 available) 11728 1726882236.40750: exiting _queue_task() for managed_node3/debug 11728 1726882236.40762: done queuing things up, now waiting for results queue to drain 11728 1726882236.40763: waiting for pending results... 
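The "Configure networking state" task above is skipped because the role default for network_state is an empty dict and the task is gated on network_state != {}; the same conditional also skips the "Show debug messages for the network_state" task further below. It would only run if the caller supplied declarative state through the role's network_state variable. A hedged, purely illustrative value that would make the conditional true is shown here; the interface name is hypothetical, and the exact schema accepted by the role's state provider should be checked against the role documentation.

    # Hypothetical example; eth1 and its settings are made up for illustration.
    network_state:
      interfaces:
        - name: eth1
          type: ethernet
          state: up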
11728 1726882236.40950: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11728 1726882236.41055: in run() - task 12673a56-9f93-5c28-a762-000000000e1c 11728 1726882236.41066: variable 'ansible_search_path' from source: unknown 11728 1726882236.41070: variable 'ansible_search_path' from source: unknown 11728 1726882236.41101: calling self._execute() 11728 1726882236.41169: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882236.41174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882236.41181: variable 'omit' from source: magic vars 11728 1726882236.41448: variable 'ansible_distribution_major_version' from source: facts 11728 1726882236.41463: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882236.41467: variable 'omit' from source: magic vars 11728 1726882236.41512: variable 'omit' from source: magic vars 11728 1726882236.41537: variable 'omit' from source: magic vars 11728 1726882236.41573: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882236.41600: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882236.41613: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882236.41626: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882236.41638: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882236.41661: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882236.41664: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882236.41666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882236.41734: Set connection var ansible_connection to ssh 11728 1726882236.41743: Set connection var ansible_shell_executable to /bin/sh 11728 1726882236.41749: Set connection var ansible_timeout to 10 11728 1726882236.41752: Set connection var ansible_shell_type to sh 11728 1726882236.41758: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882236.41763: Set connection var ansible_pipelining to False 11728 1726882236.41780: variable 'ansible_shell_executable' from source: unknown 11728 1726882236.41784: variable 'ansible_connection' from source: unknown 11728 1726882236.41787: variable 'ansible_module_compression' from source: unknown 11728 1726882236.41789: variable 'ansible_shell_type' from source: unknown 11728 1726882236.41791: variable 'ansible_shell_executable' from source: unknown 11728 1726882236.41798: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882236.41800: variable 'ansible_pipelining' from source: unknown 11728 1726882236.41803: variable 'ansible_timeout' from source: unknown 11728 1726882236.41808: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882236.41904: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 
1726882236.41917: variable 'omit' from source: magic vars 11728 1726882236.41921: starting attempt loop 11728 1726882236.41924: running the handler 11728 1726882236.42016: variable '__network_connections_result' from source: set_fact 11728 1726882236.42057: handler run complete 11728 1726882236.42071: attempt loop complete, returning result 11728 1726882236.42073: _execute() done 11728 1726882236.42076: dumping result to json 11728 1726882236.42080: done dumping result, returning 11728 1726882236.42088: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-5c28-a762-000000000e1c] 11728 1726882236.42097: sending task result for task 12673a56-9f93-5c28-a762-000000000e1c 11728 1726882236.42176: done sending task result for task 12673a56-9f93-5c28-a762-000000000e1c 11728 1726882236.42179: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "" ] } 11728 1726882236.42254: no more pending results, returning what we have 11728 1726882236.42258: results queue empty 11728 1726882236.42259: checking for any_errors_fatal 11728 1726882236.42264: done checking for any_errors_fatal 11728 1726882236.42264: checking for max_fail_percentage 11728 1726882236.42266: done checking for max_fail_percentage 11728 1726882236.42266: checking to see if all hosts have failed and the running result is not ok 11728 1726882236.42267: done checking to see if all hosts have failed 11728 1726882236.42268: getting the remaining hosts for this loop 11728 1726882236.42269: done getting the remaining hosts for this loop 11728 1726882236.42272: getting the next task for host managed_node3 11728 1726882236.42279: done getting next task for host managed_node3 11728 1726882236.42282: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11728 1726882236.42290: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882236.42305: getting variables 11728 1726882236.42306: in VariableManager get_vars() 11728 1726882236.42342: Calling all_inventory to load vars for managed_node3 11728 1726882236.42345: Calling groups_inventory to load vars for managed_node3 11728 1726882236.42347: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882236.42354: Calling all_plugins_play to load vars for managed_node3 11728 1726882236.42356: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882236.42359: Calling groups_plugins_play to load vars for managed_node3 11728 1726882236.43280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882236.44137: done with get_vars() 11728 1726882236.44153: done getting variables 11728 1726882236.44196: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:30:36 -0400 (0:00:00.037) 0:01:01.294 ****** 11728 1726882236.44226: entering _queue_task() for managed_node3/debug 11728 1726882236.44485: worker is 1 (out of 1 available) 11728 1726882236.44502: exiting _queue_task() for managed_node3/debug 11728 1726882236.44515: done queuing things up, now waiting for results queue to drain 11728 1726882236.44517: waiting for pending results... 11728 1726882236.44913: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11728 1726882236.45007: in run() - task 12673a56-9f93-5c28-a762-000000000e1d 11728 1726882236.45030: variable 'ansible_search_path' from source: unknown 11728 1726882236.45038: variable 'ansible_search_path' from source: unknown 11728 1726882236.45076: calling self._execute() 11728 1726882236.45178: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882236.45191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882236.45212: variable 'omit' from source: magic vars 11728 1726882236.45598: variable 'ansible_distribution_major_version' from source: facts 11728 1726882236.45618: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882236.45622: variable 'omit' from source: magic vars 11728 1726882236.45674: variable 'omit' from source: magic vars 11728 1726882236.45701: variable 'omit' from source: magic vars 11728 1726882236.45734: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882236.45762: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882236.45777: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882236.45790: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882236.45805: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882236.45834: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882236.45838: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882236.45841: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882236.45905: Set connection var ansible_connection to ssh 11728 1726882236.45915: Set connection var ansible_shell_executable to /bin/sh 11728 1726882236.45918: Set connection var ansible_timeout to 10 11728 1726882236.45921: Set connection var ansible_shell_type to sh 11728 1726882236.45928: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882236.45932: Set connection var ansible_pipelining to False 11728 1726882236.45952: variable 'ansible_shell_executable' from source: unknown 11728 1726882236.45956: variable 'ansible_connection' from source: unknown 11728 1726882236.45959: variable 'ansible_module_compression' from source: unknown 11728 1726882236.45961: variable 'ansible_shell_type' from source: unknown 11728 1726882236.45964: variable 'ansible_shell_executable' from source: unknown 11728 1726882236.45966: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882236.45968: variable 'ansible_pipelining' from source: unknown 11728 1726882236.45970: variable 'ansible_timeout' from source: unknown 11728 1726882236.45973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882236.46080: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882236.46089: variable 'omit' from source: magic vars 11728 1726882236.46096: starting attempt loop 11728 1726882236.46102: running the handler 11728 1726882236.46140: variable '__network_connections_result' from source: set_fact 11728 1726882236.46198: variable '__network_connections_result' from source: set_fact 11728 1726882236.46306: handler run complete 11728 1726882236.46326: attempt loop complete, returning result 11728 1726882236.46330: _execute() done 11728 1726882236.46332: dumping result to json 11728 1726882236.46337: done dumping result, returning 11728 1726882236.46345: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-5c28-a762-000000000e1d] 11728 1726882236.46349: sending task result for task 12673a56-9f93-5c28-a762-000000000e1d 11728 1726882236.46445: done sending task result for task 12673a56-9f93-5c28-a762-000000000e1d 11728 1726882236.46448: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 11728 1726882236.46540: no more pending results, returning what we have 11728 1726882236.46543: results queue empty 11728 1726882236.46544: checking for any_errors_fatal 11728 1726882236.46551: done checking for any_errors_fatal 11728 
1726882236.46552: checking for max_fail_percentage 11728 1726882236.46553: done checking for max_fail_percentage 11728 1726882236.46554: checking to see if all hosts have failed and the running result is not ok 11728 1726882236.46554: done checking to see if all hosts have failed 11728 1726882236.46555: getting the remaining hosts for this loop 11728 1726882236.46557: done getting the remaining hosts for this loop 11728 1726882236.46560: getting the next task for host managed_node3 11728 1726882236.46567: done getting next task for host managed_node3 11728 1726882236.46570: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11728 1726882236.46575: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882236.46586: getting variables 11728 1726882236.46587: in VariableManager get_vars() 11728 1726882236.46627: Calling all_inventory to load vars for managed_node3 11728 1726882236.46635: Calling groups_inventory to load vars for managed_node3 11728 1726882236.46637: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882236.46645: Calling all_plugins_play to load vars for managed_node3 11728 1726882236.46648: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882236.46650: Calling groups_plugins_play to load vars for managed_node3 11728 1726882236.47536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882236.49062: done with get_vars() 11728 1726882236.49083: done getting variables 11728 1726882236.49141: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:30:36 -0400 (0:00:00.049) 0:01:01.344 ****** 11728 1726882236.49175: entering _queue_task() for managed_node3/debug 11728 1726882236.49470: worker is 1 (out of 1 available) 11728 1726882236.49483: exiting _queue_task() for managed_node3/debug 11728 1726882236.49700: done queuing things up, now waiting for results queue to drain 11728 1726882236.49702: waiting for pending results... 11728 1726882236.49830: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11728 1726882236.50036: in run() - task 12673a56-9f93-5c28-a762-000000000e1e 11728 1726882236.50040: variable 'ansible_search_path' from source: unknown 11728 1726882236.50042: variable 'ansible_search_path' from source: unknown 11728 1726882236.50045: calling self._execute() 11728 1726882236.50109: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882236.50121: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882236.50134: variable 'omit' from source: magic vars 11728 1726882236.50509: variable 'ansible_distribution_major_version' from source: facts 11728 1726882236.50526: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882236.50654: variable 'network_state' from source: role '' defaults 11728 1726882236.50670: Evaluated conditional (network_state != {}): False 11728 1726882236.50677: when evaluation is False, skipping this task 11728 1726882236.50688: _execute() done 11728 1726882236.50700: dumping result to json 11728 1726882236.50707: done dumping result, returning 11728 1726882236.50720: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-5c28-a762-000000000e1e] 11728 1726882236.50730: sending task result for task 12673a56-9f93-5c28-a762-000000000e1e skipping: [managed_node3] => { "false_condition": "network_state != {}" } 11728 1726882236.50985: no more pending results, returning what we have 11728 1726882236.50989: results queue empty 11728 1726882236.50990: checking for any_errors_fatal 11728 1726882236.51006: done checking 
for any_errors_fatal 11728 1726882236.51007: checking for max_fail_percentage 11728 1726882236.51009: done checking for max_fail_percentage 11728 1726882236.51010: checking to see if all hosts have failed and the running result is not ok 11728 1726882236.51011: done checking to see if all hosts have failed 11728 1726882236.51011: getting the remaining hosts for this loop 11728 1726882236.51013: done getting the remaining hosts for this loop 11728 1726882236.51016: getting the next task for host managed_node3 11728 1726882236.51025: done getting next task for host managed_node3 11728 1726882236.51028: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 11728 1726882236.51034: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882236.51060: getting variables 11728 1726882236.51062: in VariableManager get_vars() 11728 1726882236.51287: Calling all_inventory to load vars for managed_node3 11728 1726882236.51290: Calling groups_inventory to load vars for managed_node3 11728 1726882236.51297: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882236.51304: done sending task result for task 12673a56-9f93-5c28-a762-000000000e1e 11728 1726882236.51306: WORKER PROCESS EXITING 11728 1726882236.51314: Calling all_plugins_play to load vars for managed_node3 11728 1726882236.51317: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882236.51320: Calling groups_plugins_play to load vars for managed_node3 11728 1726882236.52799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882236.54317: done with get_vars() 11728 1726882236.54339: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:30:36 -0400 (0:00:00.052) 0:01:01.396 ****** 11728 1726882236.54437: entering _queue_task() for managed_node3/ping 11728 1726882236.54767: worker is 1 (out of 1 available) 11728 1726882236.54780: exiting _queue_task() for managed_node3/ping 11728 1726882236.54796: done queuing things up, now waiting for results queue to drain 11728 1726882236.54798: waiting for pending results... 
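The "Re-test connectivity" task queued above runs the plain ping module over the already-multiplexed SSH connection; the chunks that follow show the usual AnsiballZ lifecycle (create a remote tmp dir, sftp AnsiballZ_ping.py across, chmod it, execute it with /usr/bin/python3.12, then delete the tmp dir), and the target answers {"ping": "pong"}, confirming the managed node is still reachable after the bond profiles were removed. A hedged sketch of what the role task at tasks/main.yml:192 presumably looks like:

    # Sketch of the role's connectivity re-test; the actual task in the role may differ.
    - name: Re-test connectivity
      ping: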
11728 1726882236.55117: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 11728 1726882236.55252: in run() - task 12673a56-9f93-5c28-a762-000000000e1f 11728 1726882236.55269: variable 'ansible_search_path' from source: unknown 11728 1726882236.55273: variable 'ansible_search_path' from source: unknown 11728 1726882236.55305: calling self._execute() 11728 1726882236.55377: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882236.55381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882236.55390: variable 'omit' from source: magic vars 11728 1726882236.55663: variable 'ansible_distribution_major_version' from source: facts 11728 1726882236.55672: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882236.55678: variable 'omit' from source: magic vars 11728 1726882236.55732: variable 'omit' from source: magic vars 11728 1726882236.55755: variable 'omit' from source: magic vars 11728 1726882236.55786: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882236.55820: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882236.55834: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882236.55848: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882236.55858: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882236.55881: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882236.55884: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882236.55886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882236.55958: Set connection var ansible_connection to ssh 11728 1726882236.55966: Set connection var ansible_shell_executable to /bin/sh 11728 1726882236.55971: Set connection var ansible_timeout to 10 11728 1726882236.55974: Set connection var ansible_shell_type to sh 11728 1726882236.55980: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882236.55985: Set connection var ansible_pipelining to False 11728 1726882236.56007: variable 'ansible_shell_executable' from source: unknown 11728 1726882236.56010: variable 'ansible_connection' from source: unknown 11728 1726882236.56013: variable 'ansible_module_compression' from source: unknown 11728 1726882236.56016: variable 'ansible_shell_type' from source: unknown 11728 1726882236.56020: variable 'ansible_shell_executable' from source: unknown 11728 1726882236.56023: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882236.56025: variable 'ansible_pipelining' from source: unknown 11728 1726882236.56027: variable 'ansible_timeout' from source: unknown 11728 1726882236.56029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882236.56175: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11728 1726882236.56184: variable 'omit' from source: magic vars 11728 
1726882236.56188: starting attempt loop 11728 1726882236.56191: running the handler 11728 1726882236.56209: _low_level_execute_command(): starting 11728 1726882236.56215: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882236.57192: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882236.57200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882236.57204: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882236.57279: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882236.57316: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882236.57413: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882236.59099: stdout chunk (state=3): >>>/root <<< 11728 1726882236.59230: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882236.59289: stderr chunk (state=3): >>><<< 11728 1726882236.59301: stdout chunk (state=3): >>><<< 11728 1726882236.59340: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882236.59371: _low_level_execute_command(): starting 11728 1726882236.59382: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882236.5934846-14689-58662280843037 `" && echo 
ansible-tmp-1726882236.5934846-14689-58662280843037="` echo /root/.ansible/tmp/ansible-tmp-1726882236.5934846-14689-58662280843037 `" ) && sleep 0' 11728 1726882236.60070: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882236.60085: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882236.60136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882236.60226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882236.60278: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882236.60375: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882236.62603: stdout chunk (state=3): >>>ansible-tmp-1726882236.5934846-14689-58662280843037=/root/.ansible/tmp/ansible-tmp-1726882236.5934846-14689-58662280843037 <<< 11728 1726882236.62608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882236.62610: stdout chunk (state=3): >>><<< 11728 1726882236.62613: stderr chunk (state=3): >>><<< 11728 1726882236.62616: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882236.5934846-14689-58662280843037=/root/.ansible/tmp/ansible-tmp-1726882236.5934846-14689-58662280843037 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882236.62637: variable 'ansible_module_compression' from source: unknown 11728 1726882236.62685: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 11728 1726882236.62820: variable 'ansible_facts' from source: unknown 11728 1726882236.63034: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882236.5934846-14689-58662280843037/AnsiballZ_ping.py 11728 1726882236.63487: Sending initial data 11728 1726882236.63491: Sent initial data (152 bytes) 11728 1726882236.64679: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882236.64932: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882236.65016: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882236.65046: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882236.65126: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882236.66700: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882236.66814: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882236.66936: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmphcws7rro /root/.ansible/tmp/ansible-tmp-1726882236.5934846-14689-58662280843037/AnsiballZ_ping.py <<< 11728 1726882236.66939: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882236.5934846-14689-58662280843037/AnsiballZ_ping.py" <<< 11728 1726882236.66978: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmphcws7rro" to remote "/root/.ansible/tmp/ansible-tmp-1726882236.5934846-14689-58662280843037/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882236.5934846-14689-58662280843037/AnsiballZ_ping.py" <<< 11728 1726882236.68498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882236.68516: stdout chunk (state=3): >>><<< 11728 1726882236.68716: stderr chunk (state=3): >>><<< 11728 1726882236.68720: done transferring module to remote 11728 1726882236.68722: _low_level_execute_command(): starting 11728 1726882236.68725: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882236.5934846-14689-58662280843037/ /root/.ansible/tmp/ansible-tmp-1726882236.5934846-14689-58662280843037/AnsiballZ_ping.py && sleep 0' 11728 1726882236.69716: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882236.69775: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882236.69801: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882236.69854: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882236.70004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882236.71722: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882236.71763: stderr chunk (state=3): >>><<< 11728 1726882236.71767: stdout chunk (state=3): >>><<< 11728 1726882236.71787: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882236.71800: _low_level_execute_command(): starting 11728 1726882236.71803: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882236.5934846-14689-58662280843037/AnsiballZ_ping.py && sleep 0' 11728 1726882236.72499: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882236.72503: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882236.72505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882236.72508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882236.72510: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882236.72512: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882236.72514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882236.72521: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882236.72531: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882236.72539: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11728 1726882236.72553: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882236.72564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882236.72577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882236.72704: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882236.72707: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882236.72757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882236.87731: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 11728 1726882236.89112: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 11728 1726882236.89115: stdout chunk (state=3): >>><<< 11728 1726882236.89117: stderr chunk (state=3): >>><<< 11728 1726882236.89240: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 11728 1726882236.89245: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882236.5934846-14689-58662280843037/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882236.89248: _low_level_execute_command(): starting 11728 1726882236.89250: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882236.5934846-14689-58662280843037/ > /dev/null 2>&1 && sleep 0' 11728 1726882236.89843: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882236.89859: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882236.89874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882236.89901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882236.89920: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882236.89938: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882236.89954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882236.89977: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882236.90056: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882236.90119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882236.90136: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882236.90168: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882236.90492: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882236.92602: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882236.92606: stdout chunk (state=3): >>><<< 11728 1726882236.92608: stderr chunk (state=3): >>><<< 11728 1726882236.92611: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882236.92618: handler run complete 11728 1726882236.92620: attempt loop complete, returning result 11728 1726882236.92622: _execute() done 11728 1726882236.92624: dumping result to json 11728 1726882236.92626: done dumping result, returning 11728 1726882236.92628: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-5c28-a762-000000000e1f] 11728 1726882236.92629: sending task result for task 12673a56-9f93-5c28-a762-000000000e1f 11728 1726882236.92691: done sending task result for task 12673a56-9f93-5c28-a762-000000000e1f 11728 1726882236.92698: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 11728 1726882236.92766: no more pending results, returning what we have 11728 1726882236.92770: results queue empty 11728 1726882236.92771: checking for any_errors_fatal 11728 1726882236.92777: done checking for any_errors_fatal 11728 1726882236.92778: checking for max_fail_percentage 11728 1726882236.92779: done checking for max_fail_percentage 11728 1726882236.92780: checking to see if all hosts have failed and the running result is not ok 11728 1726882236.92781: done checking to see if all hosts have failed 11728 1726882236.92782: getting the remaining hosts for this loop 11728 1726882236.92783: done getting the remaining hosts 
for this loop 11728 1726882236.92787: getting the next task for host managed_node3 11728 1726882236.92802: done getting next task for host managed_node3 11728 1726882236.92805: ^ task is: TASK: meta (role_complete) 11728 1726882236.92811: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882236.92825: getting variables 11728 1726882236.92827: in VariableManager get_vars() 11728 1726882236.92875: Calling all_inventory to load vars for managed_node3 11728 1726882236.92878: Calling groups_inventory to load vars for managed_node3 11728 1726882236.92880: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882236.92890: Calling all_plugins_play to load vars for managed_node3 11728 1726882236.92892: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882236.93308: Calling groups_plugins_play to load vars for managed_node3 11728 1726882236.95937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882236.99663: done with get_vars() 11728 1726882236.99699: done getting variables 11728 1726882236.99927: done queuing things up, now waiting for results queue to drain 11728 1726882236.99929: results queue empty 11728 1726882236.99930: checking for any_errors_fatal 11728 1726882236.99934: done checking for any_errors_fatal 11728 1726882236.99935: checking for max_fail_percentage 11728 1726882236.99936: done checking for max_fail_percentage 11728 1726882236.99937: checking to see if all hosts have failed and the running result is not ok 11728 1726882236.99937: done checking to see if all hosts have failed 11728 1726882236.99938: getting the remaining hosts for this loop 11728 1726882236.99939: done getting the remaining hosts for this loop 11728 1726882236.99942: getting the next task for host managed_node3 11728 1726882236.99948: done getting next task for host managed_node3 11728 1726882236.99950: ^ task is: TASK: Delete the device '{{ controller_device }}' 11728 1726882236.99953: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=2, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882236.99956: getting variables 11728 1726882236.99957: in VariableManager get_vars() 11728 1726882237.00111: Calling all_inventory to load vars for managed_node3 11728 1726882237.00113: Calling groups_inventory to load vars for managed_node3 11728 1726882237.00115: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882237.00120: Calling all_plugins_play to load vars for managed_node3 11728 1726882237.00122: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882237.00124: Calling groups_plugins_play to load vars for managed_node3 11728 1726882237.03072: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882237.06367: done with get_vars() 11728 1726882237.06649: done getting variables 11728 1726882237.06701: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11728 1726882237.07119: variable 'controller_device' from source: play vars TASK [Delete the device 'nm-bond'] ********************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/cleanup_bond_profile+device.yml:22 Friday 20 September 2024 21:30:37 -0400 (0:00:00.527) 0:01:01.923 ****** 11728 1726882237.07149: entering _queue_task() for managed_node3/command 11728 1726882237.08257: worker is 1 (out of 1 available) 11728 1726882237.08270: exiting _queue_task() for managed_node3/command 11728 1726882237.08284: done queuing things up, now waiting for results queue to drain 11728 1726882237.08287: waiting for pending results... 
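The role's "Re-test connectivity" step above finishes ok with {"changed": false, "ping": "pong"} before the play moves on to cleanup. For reference, a minimal equivalent of that re-test is just the ping module; this is a sketch, not the role's actual task file, and the module simply echoes its data argument back (default "pong"):

    - name: Re-test connectivity
      ansible.builtin.ping:
      # optional: data: pong   # the module returns this value as "ping"

Execution follows the same low-level pattern visible throughout this log: create a per-task directory under ~/.ansible/tmp, sftp the AnsiballZ payload over the existing SSH control master, chmod u+x, run it with the remote /usr/bin/python3.12, then remove the temp directory.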
11728 1726882237.09013: running TaskExecutor() for managed_node3/TASK: Delete the device 'nm-bond' 11728 1726882237.09592: in run() - task 12673a56-9f93-5c28-a762-000000000e4f 11728 1726882237.09601: variable 'ansible_search_path' from source: unknown 11728 1726882237.09604: variable 'ansible_search_path' from source: unknown 11728 1726882237.09607: calling self._execute() 11728 1726882237.09674: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882237.09678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882237.09687: variable 'omit' from source: magic vars 11728 1726882237.10600: variable 'ansible_distribution_major_version' from source: facts 11728 1726882237.10604: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882237.10606: variable 'omit' from source: magic vars 11728 1726882237.10608: variable 'omit' from source: magic vars 11728 1726882237.10769: variable 'controller_device' from source: play vars 11728 1726882237.11098: variable 'omit' from source: magic vars 11728 1726882237.11102: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882237.11104: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882237.11298: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882237.11300: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882237.11302: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882237.11304: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882237.11306: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882237.11307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882237.11309: Set connection var ansible_connection to ssh 11728 1726882237.11311: Set connection var ansible_shell_executable to /bin/sh 11728 1726882237.11507: Set connection var ansible_timeout to 10 11728 1726882237.11516: Set connection var ansible_shell_type to sh 11728 1726882237.11530: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882237.11539: Set connection var ansible_pipelining to False 11728 1726882237.11569: variable 'ansible_shell_executable' from source: unknown 11728 1726882237.11577: variable 'ansible_connection' from source: unknown 11728 1726882237.11584: variable 'ansible_module_compression' from source: unknown 11728 1726882237.11590: variable 'ansible_shell_type' from source: unknown 11728 1726882237.11603: variable 'ansible_shell_executable' from source: unknown 11728 1726882237.11610: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882237.11617: variable 'ansible_pipelining' from source: unknown 11728 1726882237.11625: variable 'ansible_timeout' from source: unknown 11728 1726882237.11632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882237.11778: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 11728 1726882237.12012: variable 'omit' from source: magic vars 11728 1726882237.12024: starting attempt loop 11728 1726882237.12031: running the handler 11728 1726882237.12052: _low_level_execute_command(): starting 11728 1726882237.12067: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882237.13584: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882237.13604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882237.13618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882237.13799: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882237.13913: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882237.13943: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882237.15610: stdout chunk (state=3): >>>/root <<< 11728 1726882237.15689: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882237.15759: stderr chunk (state=3): >>><<< 11728 1726882237.15768: stdout chunk (state=3): >>><<< 11728 1726882237.15796: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882237.15819: _low_level_execute_command(): starting 11728 1726882237.15861: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882237.1580498-14718-235593976969394 `" && echo ansible-tmp-1726882237.1580498-14718-235593976969394="` echo /root/.ansible/tmp/ansible-tmp-1726882237.1580498-14718-235593976969394 `" ) && sleep 0' 11728 1726882237.16999: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882237.17112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882237.17353: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882237.17360: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882237.17363: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882237.17417: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882237.19285: stdout chunk (state=3): >>>ansible-tmp-1726882237.1580498-14718-235593976969394=/root/.ansible/tmp/ansible-tmp-1726882237.1580498-14718-235593976969394 <<< 11728 1726882237.19423: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882237.19427: stderr chunk (state=3): >>><<< 11728 1726882237.19429: stdout chunk (state=3): >>><<< 11728 1726882237.19445: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882237.1580498-14718-235593976969394=/root/.ansible/tmp/ansible-tmp-1726882237.1580498-14718-235593976969394 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882237.19803: variable 'ansible_module_compression' from source: unknown 11728 
1726882237.19806: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882237.19820: variable 'ansible_facts' from source: unknown 11728 1726882237.20026: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882237.1580498-14718-235593976969394/AnsiballZ_command.py 11728 1726882237.20415: Sending initial data 11728 1726882237.20425: Sent initial data (156 bytes) 11728 1726882237.21309: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882237.21334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882237.21338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882237.21445: stderr chunk (state=3): >>>debug2: match found <<< 11728 1726882237.21543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882237.21759: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882237.21771: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882237.22161: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882237.23698: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882237.23726: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882237.23795: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpjjdf79t5 /root/.ansible/tmp/ansible-tmp-1726882237.1580498-14718-235593976969394/AnsiballZ_command.py <<< 11728 1726882237.23806: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882237.1580498-14718-235593976969394/AnsiballZ_command.py" <<< 11728 1726882237.24028: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpjjdf79t5" to remote "/root/.ansible/tmp/ansible-tmp-1726882237.1580498-14718-235593976969394/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882237.1580498-14718-235593976969394/AnsiballZ_command.py" <<< 11728 1726882237.25155: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882237.25167: stdout chunk (state=3): >>><<< 11728 1726882237.25174: stderr chunk (state=3): >>><<< 11728 1726882237.25233: done transferring module to remote 11728 1726882237.25244: _low_level_execute_command(): starting 11728 1726882237.25250: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882237.1580498-14718-235593976969394/ /root/.ansible/tmp/ansible-tmp-1726882237.1580498-14718-235593976969394/AnsiballZ_command.py && sleep 0' 11728 1726882237.26292: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882237.26310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882237.26525: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882237.26613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882237.26632: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882237.26723: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882237.28473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882237.28485: stdout chunk (state=3): >>><<< 11728 1726882237.28701: stderr chunk (state=3): >>><<< 11728 1726882237.28704: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882237.28708: _low_level_execute_command(): starting 11728 1726882237.28711: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882237.1580498-14718-235593976969394/AnsiballZ_command.py && sleep 0' 11728 1726882237.29681: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882237.29707: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882237.29724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882237.29812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882237.29844: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882237.29863: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882237.29886: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882237.30040: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882237.45937: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-20 21:30:37.450169", "end": "2024-09-20 21:30:37.457471", "delta": "0:00:00.007302", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882237.47725: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.10.229 closed. 
<<< 11728 1726882237.47729: stdout chunk (state=3): >>><<< 11728 1726882237.47732: stderr chunk (state=3): >>><<< 11728 1726882237.47734: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-20 21:30:37.450169", "end": "2024-09-20 21:30:37.457471", "delta": "0:00:00.007302", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.10.229 closed. 
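The command above exits with rc=1 because nm-bond no longer exists, yet the task is reported ok in the lines that follow: its failed_when (and changed_when) conditionals both evaluate to False, so the non-zero return code is tolerated. A hedged sketch of a cleanup task with that behavior is shown below; the register name and the exact conditions are illustrative, not the contents of cleanup_bond_profile+device.yml:

    - name: Delete the device '{{ controller_device }}'
      ansible.builtin.command: ip link del {{ controller_device }}
      register: _del_device            # illustrative variable name
      # treat a missing device as success so repeated cleanup runs stay ok
      failed_when:
        - _del_device.rc != 0
        - '"Cannot find device" not in _del_device.stderr'
      changed_when: _del_device.rc == 0

With rc=1 and stderr 'Cannot find device "nm-bond"', both expressions come out False, matching the "Evaluated conditional (False): False" entries and the final changed: false / failed_when_result: false result.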
11728 1726882237.47737: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882237.1580498-14718-235593976969394/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882237.47740: _low_level_execute_command(): starting 11728 1726882237.47742: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882237.1580498-14718-235593976969394/ > /dev/null 2>&1 && sleep 0' 11728 1726882237.48830: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882237.49061: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882237.49113: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882237.49147: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882237.51092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882237.51403: stderr chunk (state=3): >>><<< 11728 1726882237.51406: stdout chunk (state=3): >>><<< 11728 1726882237.51409: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882237.51411: handler run complete 11728 1726882237.51414: Evaluated conditional (False): False 11728 1726882237.51416: Evaluated conditional (False): False 11728 1726882237.51418: attempt loop complete, returning result 11728 1726882237.51420: _execute() done 11728 1726882237.51422: dumping result to json 11728 1726882237.51424: done dumping result, returning 11728 1726882237.51426: done running TaskExecutor() for managed_node3/TASK: Delete the device 'nm-bond' [12673a56-9f93-5c28-a762-000000000e4f] 11728 1726882237.51428: sending task result for task 12673a56-9f93-5c28-a762-000000000e4f 11728 1726882237.51507: done sending task result for task 12673a56-9f93-5c28-a762-000000000e4f ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "link", "del", "nm-bond" ], "delta": "0:00:00.007302", "end": "2024-09-20 21:30:37.457471", "failed_when_result": false, "rc": 1, "start": "2024-09-20 21:30:37.450169" } STDERR: Cannot find device "nm-bond" MSG: non-zero return code 11728 1726882237.51580: no more pending results, returning what we have 11728 1726882237.51585: results queue empty 11728 1726882237.51586: checking for any_errors_fatal 11728 1726882237.51588: done checking for any_errors_fatal 11728 1726882237.51589: checking for max_fail_percentage 11728 1726882237.51591: done checking for max_fail_percentage 11728 1726882237.51592: checking to see if all hosts have failed and the running result is not ok 11728 1726882237.51597: done checking to see if all hosts have failed 11728 1726882237.51598: getting the remaining hosts for this loop 11728 1726882237.51600: done getting the remaining hosts for this loop 11728 1726882237.51603: getting the next task for host managed_node3 11728 1726882237.51614: done getting next task for host managed_node3 11728 1726882237.51617: ^ task is: TASK: Remove test interfaces 11728 1726882237.51622: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11728 1726882237.51627: getting variables 11728 1726882237.51628: in VariableManager get_vars() 11728 1726882237.51679: Calling all_inventory to load vars for managed_node3 11728 1726882237.51682: Calling groups_inventory to load vars for managed_node3 11728 1726882237.51685: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882237.52002: Calling all_plugins_play to load vars for managed_node3 11728 1726882237.52007: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882237.52012: Calling groups_plugins_play to load vars for managed_node3 11728 1726882237.53019: WORKER PROCESS EXITING 11728 1726882237.54926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882237.57729: done with get_vars() 11728 1726882237.57753: done getting variables 11728 1726882237.57822: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 21:30:37 -0400 (0:00:00.507) 0:01:02.431 ****** 11728 1726882237.57861: entering _queue_task() for managed_node3/shell 11728 1726882237.58248: worker is 1 (out of 1 available) 11728 1726882237.58261: exiting _queue_task() for managed_node3/shell 11728 1726882237.58388: done queuing things up, now waiting for results queue to drain 11728 1726882237.58390: waiting for pending results... 
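The next step, "Remove test interfaces" from remove_test_interfaces_with_dhcp.yml:3, is queued as a shell action, and the lines below show it resolving the dhcp_interface1 and dhcp_interface2 play vars before building the command. The actual script body is not reproduced in this part of the log; the following is only a rough, hypothetical sketch of a shell task consuming those two variables:

    - name: Remove test interfaces
      ansible.builtin.shell: |
        # hypothetical commands for illustration only; the real script in
        # remove_test_interfaces_with_dhcp.yml is not shown in this log
        ip link delete {{ dhcp_interface1 }} || true
        ip link delete {{ dhcp_interface2 }} || true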
11728 1726882237.58623: running TaskExecutor() for managed_node3/TASK: Remove test interfaces 11728 1726882237.58763: in run() - task 12673a56-9f93-5c28-a762-000000000e55 11728 1726882237.58785: variable 'ansible_search_path' from source: unknown 11728 1726882237.58797: variable 'ansible_search_path' from source: unknown 11728 1726882237.58846: calling self._execute() 11728 1726882237.58966: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882237.58979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882237.58997: variable 'omit' from source: magic vars 11728 1726882237.59408: variable 'ansible_distribution_major_version' from source: facts 11728 1726882237.59426: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882237.59438: variable 'omit' from source: magic vars 11728 1726882237.59497: variable 'omit' from source: magic vars 11728 1726882237.59661: variable 'dhcp_interface1' from source: play vars 11728 1726882237.59673: variable 'dhcp_interface2' from source: play vars 11728 1726882237.59709: variable 'omit' from source: magic vars 11728 1726882237.59755: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882237.59803: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882237.59833: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882237.59857: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882237.59874: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882237.59917: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882237.59924: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882237.59930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882237.60048: Set connection var ansible_connection to ssh 11728 1726882237.60051: Set connection var ansible_shell_executable to /bin/sh 11728 1726882237.60053: Set connection var ansible_timeout to 10 11728 1726882237.60057: Set connection var ansible_shell_type to sh 11728 1726882237.60067: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882237.60074: Set connection var ansible_pipelining to False 11728 1726882237.60125: variable 'ansible_shell_executable' from source: unknown 11728 1726882237.60128: variable 'ansible_connection' from source: unknown 11728 1726882237.60130: variable 'ansible_module_compression' from source: unknown 11728 1726882237.60131: variable 'ansible_shell_type' from source: unknown 11728 1726882237.60133: variable 'ansible_shell_executable' from source: unknown 11728 1726882237.60135: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882237.60137: variable 'ansible_pipelining' from source: unknown 11728 1726882237.60156: variable 'ansible_timeout' from source: unknown 11728 1726882237.60159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882237.60292: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882237.60341: variable 'omit' from source: magic vars 11728 1726882237.60344: starting attempt loop 11728 1726882237.60346: running the handler 11728 1726882237.60348: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882237.60363: _low_level_execute_command(): starting 11728 1726882237.60379: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882237.61156: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882237.61212: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882237.61303: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882237.61345: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882237.61398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882237.63660: stdout chunk (state=3): >>>/root <<< 11728 1726882237.63664: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882237.63666: stdout chunk (state=3): >>><<< 11728 1726882237.63669: stderr chunk (state=3): >>><<< 11728 1726882237.63691: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882237.63753: _low_level_execute_command(): starting 11728 1726882237.63757: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882237.6369092-14738-132758573212681 `" && echo ansible-tmp-1726882237.6369092-14738-132758573212681="` echo /root/.ansible/tmp/ansible-tmp-1726882237.6369092-14738-132758573212681 `" ) && sleep 0' 11728 1726882237.65109: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882237.65174: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882237.65249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882237.67154: stdout chunk (state=3): >>>ansible-tmp-1726882237.6369092-14738-132758573212681=/root/.ansible/tmp/ansible-tmp-1726882237.6369092-14738-132758573212681 <<< 11728 1726882237.67237: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882237.67241: stderr chunk (state=3): >>><<< 11728 1726882237.67243: stdout chunk (state=3): >>><<< 11728 1726882237.67267: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882237.6369092-14738-132758573212681=/root/.ansible/tmp/ansible-tmp-1726882237.6369092-14738-132758573212681 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882237.67306: variable 'ansible_module_compression' from source: unknown 11728 1726882237.67361: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882237.67522: variable 'ansible_facts' from source: unknown 11728 1726882237.67733: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882237.6369092-14738-132758573212681/AnsiballZ_command.py 11728 1726882237.68137: Sending initial data 11728 1726882237.68140: Sent initial data (156 bytes) 11728 1726882237.69203: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882237.69321: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882237.69330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882237.69461: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882237.69568: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882237.69656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882237.71340: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882237.71382: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882237.71422: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmp1_w2pf5x /root/.ansible/tmp/ansible-tmp-1726882237.6369092-14738-132758573212681/AnsiballZ_command.py <<< 11728 1726882237.71426: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882237.6369092-14738-132758573212681/AnsiballZ_command.py" <<< 11728 1726882237.71832: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmp1_w2pf5x" to remote "/root/.ansible/tmp/ansible-tmp-1726882237.6369092-14738-132758573212681/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882237.6369092-14738-132758573212681/AnsiballZ_command.py" <<< 11728 1726882237.73376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882237.73401: stderr chunk (state=3): >>><<< 11728 1726882237.73405: stdout chunk (state=3): >>><<< 11728 1726882237.73428: done transferring module to remote 11728 1726882237.73439: _low_level_execute_command(): starting 11728 1726882237.73444: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882237.6369092-14738-132758573212681/ /root/.ansible/tmp/ansible-tmp-1726882237.6369092-14738-132758573212681/AnsiballZ_command.py && sleep 0' 11728 1726882237.74668: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882237.74904: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882237.74924: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882237.75005: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882237.75009: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882237.76967: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882237.76971: stdout chunk (state=3): >>><<< 11728 1726882237.76979: stderr chunk (state=3): >>><<< 11728 1726882237.77003: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882237.77010: _low_level_execute_command(): starting 11728 1726882237.77013: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882237.6369092-14738-132758573212681/AnsiballZ_command.py && sleep 0' 11728 1726882237.78041: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882237.78045: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882237.78047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882237.78050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882237.78052: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882237.78054: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882237.78056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882237.78059: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882237.78061: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882237.78063: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11728 1726882237.78065: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882237.78067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882237.78069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882237.78071: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882237.78073: stderr chunk (state=3): >>>debug2: match found <<< 11728 1726882237.78074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882237.78076: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882237.78078: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882237.78080: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882237.78259: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882237.97779: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || 
rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 21:30:37.929791", "end": "2024-09-20 21:30:37.976135", "delta": "0:00:00.046344", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882237.99483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882237.99505: stderr chunk (state=3): >>><<< 11728 1726882237.99508: stdout chunk (state=3): >>><<< 11728 1726882237.99523: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 21:30:37.929791", "end": "2024-09-20 21:30:37.976135", "delta": "0:00:00.046344", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 11728 1726882237.99559: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882237.6369092-14738-132758573212681/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882237.99566: _low_level_execute_command(): starting 11728 1726882237.99569: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882237.6369092-14738-132758573212681/ > /dev/null 2>&1 && sleep 0' 11728 1726882237.99962: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882237.99965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882238.00003: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882238.00006: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882238.00008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 11728 1726882238.00010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882238.00055: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882238.00058: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882238.00113: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882238.01937: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882238.01959: stderr chunk (state=3): >>><<< 11728 1726882238.01962: stdout chunk (state=3): >>><<< 11728 1726882238.01976: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882238.01982: handler run complete 11728 1726882238.02006: Evaluated conditional (False): False 11728 1726882238.02014: attempt loop complete, returning result 11728 1726882238.02017: _execute() done 11728 1726882238.02019: dumping result to json 11728 1726882238.02025: done dumping result, returning 11728 1726882238.02033: done running TaskExecutor() for managed_node3/TASK: Remove test interfaces [12673a56-9f93-5c28-a762-000000000e55] 11728 1726882238.02038: sending task result for task 12673a56-9f93-5c28-a762-000000000e55 11728 1726882238.02135: done sending task result for task 12673a56-9f93-5c28-a762-000000000e55 11728 1726882238.02137: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.046344", "end": "2024-09-20 21:30:37.976135", "rc": 0, "start": "2024-09-20 21:30:37.929791" } STDERR: + exec + rc=0 + ip link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 11728 1726882238.02211: no more pending results, returning what we have 11728 1726882238.02215: results queue empty 11728 1726882238.02215: checking for any_errors_fatal 11728 1726882238.02230: done checking for any_errors_fatal 11728 1726882238.02231: checking for max_fail_percentage 11728 1726882238.02232: done checking for max_fail_percentage 11728 1726882238.02233: checking to see if all hosts have failed and the running result is not ok 11728 1726882238.02234: done checking to see if all hosts have failed 11728 1726882238.02235: getting the remaining hosts for this loop 11728 1726882238.02236: 
done getting the remaining hosts for this loop 11728 1726882238.02240: getting the next task for host managed_node3 11728 1726882238.02248: done getting next task for host managed_node3 11728 1726882238.02251: ^ task is: TASK: Stop dnsmasq/radvd services 11728 1726882238.02254: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=3, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882238.02258: getting variables 11728 1726882238.02260: in VariableManager get_vars() 11728 1726882238.02308: Calling all_inventory to load vars for managed_node3 11728 1726882238.02311: Calling groups_inventory to load vars for managed_node3 11728 1726882238.02313: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882238.02323: Calling all_plugins_play to load vars for managed_node3 11728 1726882238.02326: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882238.02328: Calling groups_plugins_play to load vars for managed_node3 11728 1726882238.03132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882238.03982: done with get_vars() 11728 1726882238.04002: done getting variables 11728 1726882238.04047: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Friday 20 September 2024 21:30:38 -0400 (0:00:00.462) 0:01:02.893 ****** 11728 1726882238.04070: entering _queue_task() for managed_node3/shell 11728 1726882238.04314: worker is 1 (out of 1 available) 11728 1726882238.04328: exiting _queue_task() for managed_node3/shell 11728 1726882238.04342: done queuing things up, now waiting for results queue to drain 11728 1726882238.04343: waiting for pending results... 
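The module invocation logged above contains the complete _raw_params for the "Remove test interfaces" task, so the task can be sketched from the log alone. The reconstruction below is illustrative, not the verbatim source of the test playbook: the script body is copied from the logged module args, while the task layout and the changed_when setting are assumptions (the module itself returned changed=true, yet the reported result shows changed=false, which is the effect changed_when: false would have).

# Hedged reconstruction of the "Remove test interfaces" task; script body taken
# from the logged module_args, everything else is assumed.
- name: Remove test interfaces
  shell: |
    set -euxo pipefail
    exec 1>&2
    rc=0
    ip link delete test1 || rc="$?"
    if [ "$rc" != 0 ]; then
      echo ERROR - could not delete link test1 - error "$rc"
    fi
    ip link delete test2 || rc="$?"
    if [ "$rc" != 0 ]; then
      echo ERROR - could not delete link test2 - error "$rc"
    fi
    ip link delete testbr || rc="$?"
    if [ "$rc" != 0 ]; then
      echo ERROR - could not delete link testbr - error "$rc"
    fi
  changed_when: false  # assumption, inferred from changed=false in the reported result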
11728 1726882238.04537: running TaskExecutor() for managed_node3/TASK: Stop dnsmasq/radvd services 11728 1726882238.04609: in run() - task 12673a56-9f93-5c28-a762-000000000e56 11728 1726882238.04621: variable 'ansible_search_path' from source: unknown 11728 1726882238.04625: variable 'ansible_search_path' from source: unknown 11728 1726882238.04651: calling self._execute() 11728 1726882238.04735: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882238.04739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882238.04748: variable 'omit' from source: magic vars 11728 1726882238.05024: variable 'ansible_distribution_major_version' from source: facts 11728 1726882238.05034: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882238.05040: variable 'omit' from source: magic vars 11728 1726882238.05071: variable 'omit' from source: magic vars 11728 1726882238.05096: variable 'omit' from source: magic vars 11728 1726882238.05130: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882238.05157: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882238.05172: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882238.05185: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882238.05197: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882238.05229: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882238.05232: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882238.05235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882238.05297: Set connection var ansible_connection to ssh 11728 1726882238.05308: Set connection var ansible_shell_executable to /bin/sh 11728 1726882238.05313: Set connection var ansible_timeout to 10 11728 1726882238.05316: Set connection var ansible_shell_type to sh 11728 1726882238.05323: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882238.05327: Set connection var ansible_pipelining to False 11728 1726882238.05349: variable 'ansible_shell_executable' from source: unknown 11728 1726882238.05352: variable 'ansible_connection' from source: unknown 11728 1726882238.05355: variable 'ansible_module_compression' from source: unknown 11728 1726882238.05358: variable 'ansible_shell_type' from source: unknown 11728 1726882238.05360: variable 'ansible_shell_executable' from source: unknown 11728 1726882238.05362: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882238.05364: variable 'ansible_pipelining' from source: unknown 11728 1726882238.05366: variable 'ansible_timeout' from source: unknown 11728 1726882238.05368: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882238.05473: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882238.05483: variable 'omit' from source: magic vars 11728 
1726882238.05488: starting attempt loop 11728 1726882238.05490: running the handler 11728 1726882238.05504: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882238.05520: _low_level_execute_command(): starting 11728 1726882238.05527: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882238.06049: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882238.06053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882238.06057: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882238.06112: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882238.06115: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882238.06120: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882238.06166: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882238.07789: stdout chunk (state=3): >>>/root <<< 11728 1726882238.07890: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882238.07917: stderr chunk (state=3): >>><<< 11728 1726882238.07921: stdout chunk (state=3): >>><<< 11728 1726882238.07938: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 11728 1726882238.07950: _low_level_execute_command(): starting 11728 1726882238.07956: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882238.0793903-14769-1458605102634 `" && echo ansible-tmp-1726882238.0793903-14769-1458605102634="` echo /root/.ansible/tmp/ansible-tmp-1726882238.0793903-14769-1458605102634 `" ) && sleep 0' 11728 1726882238.08385: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882238.08388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882238.08401: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 11728 1726882238.08403: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882238.08406: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882238.08448: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882238.08451: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882238.08506: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882238.10363: stdout chunk (state=3): >>>ansible-tmp-1726882238.0793903-14769-1458605102634=/root/.ansible/tmp/ansible-tmp-1726882238.0793903-14769-1458605102634 <<< 11728 1726882238.10468: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882238.10496: stderr chunk (state=3): >>><<< 11728 1726882238.10500: stdout chunk (state=3): >>><<< 11728 1726882238.10514: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882238.0793903-14769-1458605102634=/root/.ansible/tmp/ansible-tmp-1726882238.0793903-14769-1458605102634 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882238.10537: variable 'ansible_module_compression' from source: unknown 11728 1726882238.10579: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882238.10614: variable 'ansible_facts' from source: unknown 11728 1726882238.10667: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882238.0793903-14769-1458605102634/AnsiballZ_command.py 11728 1726882238.10765: Sending initial data 11728 1726882238.10769: Sent initial data (154 bytes) 11728 1726882238.11184: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882238.11198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882238.11202: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882238.11235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882238.11239: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882238.11241: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882238.11244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882238.11287: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882238.11290: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882238.11342: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882238.12837: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 11728 1726882238.12841: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882238.12879: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882238.12924: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmp3f_h8zy5 /root/.ansible/tmp/ansible-tmp-1726882238.0793903-14769-1458605102634/AnsiballZ_command.py <<< 11728 1726882238.12928: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882238.0793903-14769-1458605102634/AnsiballZ_command.py" <<< 11728 1726882238.12971: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmp3f_h8zy5" to remote "/root/.ansible/tmp/ansible-tmp-1726882238.0793903-14769-1458605102634/AnsiballZ_command.py" <<< 11728 1726882238.12973: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882238.0793903-14769-1458605102634/AnsiballZ_command.py" <<< 11728 1726882238.13527: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882238.13565: stderr chunk (state=3): >>><<< 11728 1726882238.13569: stdout chunk (state=3): >>><<< 11728 1726882238.13613: done transferring module to remote 11728 1726882238.13621: _low_level_execute_command(): starting 11728 1726882238.13626: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882238.0793903-14769-1458605102634/ /root/.ansible/tmp/ansible-tmp-1726882238.0793903-14769-1458605102634/AnsiballZ_command.py && sleep 0' 11728 1726882238.14065: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882238.14068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882238.14070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration <<< 11728 1726882238.14076: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 11728 1726882238.14078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882238.14118: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882238.14121: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882238.14173: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882238.15851: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882238.15873: stderr chunk (state=3): >>><<< 11728 1726882238.15876: stdout chunk (state=3): >>><<< 11728 1726882238.15891: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882238.15899: _low_level_execute_command(): starting 11728 1726882238.15901: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882238.0793903-14769-1458605102634/AnsiballZ_command.py && sleep 0' 11728 1726882238.16330: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882238.16333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882238.16335: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882238.16338: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882238.16340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882238.16380: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882238.16383: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882238.16442: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882238.34084: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n 
fi\n done\nfi\n", "start": "2024-09-20 21:30:38.311762", "end": "2024-09-20 21:30:38.337837", "delta": "0:00:00.026075", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882238.35703: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. <<< 11728 1726882238.35707: stdout chunk (state=3): >>><<< 11728 1726882238.35709: stderr chunk (state=3): >>><<< 11728 1726882238.35712: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 21:30:38.311762", "end": "2024-09-20 21:30:38.337837", "delta": "0:00:00.026075", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 11728 1726882238.35720: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882238.0793903-14769-1458605102634/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882238.35722: _low_level_execute_command(): starting 11728 1726882238.35725: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882238.0793903-14769-1458605102634/ > /dev/null 2>&1 && sleep 0' 11728 1726882238.36375: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882238.36501: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882238.36519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882238.36541: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882238.36556: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882238.36580: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882238.36669: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882238.38490: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882238.38516: stdout chunk (state=3): >>><<< 11728 1726882238.38528: stderr chunk (state=3): >>><<< 11728 1726882238.38604: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882238.38610: handler run complete 11728 1726882238.38613: Evaluated conditional (False): False 11728 1726882238.38615: attempt loop complete, returning result 11728 1726882238.38617: _execute() done 11728 1726882238.38619: dumping result to json 11728 1726882238.38633: done dumping result, returning 11728 1726882238.38645: done running TaskExecutor() for managed_node3/TASK: Stop dnsmasq/radvd services [12673a56-9f93-5c28-a762-000000000e56] 11728 1726882238.38655: sending task result for task 12673a56-9f93-5c28-a762-000000000e56 11728 1726882238.38823: done sending task result for task 12673a56-9f93-5c28-a762-000000000e56 11728 1726882238.38826: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.026075", "end": "2024-09-20 21:30:38.337837", "rc": 0, "start": "2024-09-20 21:30:38.311762" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active firewalld inactive 11728 1726882238.39053: no more pending results, returning what we have 11728 1726882238.39057: results queue empty 11728 1726882238.39058: checking for any_errors_fatal 11728 1726882238.39068: done checking for any_errors_fatal 11728 1726882238.39069: checking for max_fail_percentage 11728 1726882238.39071: done checking for max_fail_percentage 11728 1726882238.39072: checking to see if all hosts have failed and the running result is not ok 11728 1726882238.39073: done checking to see if all hosts have failed 11728 1726882238.39074: getting the remaining hosts for this loop 
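Likewise, the module args above record the full script for the "Stop dnsmasq/radvd services" task at remove_test_interfaces_with_dhcp.yml:23. A hedged reconstruction follows; only the script body comes from the log, the rest of the task is assumed. On this host the xtrace output shows that neither branch fires: grep 'release 6' does not match /etc/redhat-release, and systemctl is-active firewalld prints "inactive", so the radvd and firewalld cleanup steps are skipped.

# Hedged reconstruction of the task at
# playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23; script body taken from
# the logged module_args, task layout and changed_when are assumptions.
- name: Stop dnsmasq/radvd services
  shell: |
    set -uxo pipefail
    exec 1>&2
    pkill -F /run/dhcp_testbr.pid
    rm -rf /run/dhcp_testbr.pid
    rm -rf /run/dhcp_testbr.lease
    if grep 'release 6' /etc/redhat-release; then
      # Stop radvd server
      service radvd stop
      iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT
    fi
    if systemctl is-active firewalld; then
      for service in dhcp dhcpv6 dhcpv6-client; do
        if firewall-cmd --query-service="$service"; then
          firewall-cmd --remove-service "$service"
        fi
      done
    fi
  changed_when: false  # assumption, inferred from changed=false in the reported result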
11728 1726882238.39076: done getting the remaining hosts for this loop 11728 1726882238.39079: getting the next task for host managed_node3 11728 1726882238.39091: done getting next task for host managed_node3 11728 1726882238.39096: ^ task is: TASK: Check routes and DNS 11728 1726882238.39101: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=4, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882238.39106: getting variables 11728 1726882238.39107: in VariableManager get_vars() 11728 1726882238.39158: Calling all_inventory to load vars for managed_node3 11728 1726882238.39162: Calling groups_inventory to load vars for managed_node3 11728 1726882238.39171: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882238.39184: Calling all_plugins_play to load vars for managed_node3 11728 1726882238.39187: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882238.39191: Calling groups_plugins_play to load vars for managed_node3 11728 1726882238.41059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882238.42656: done with get_vars() 11728 1726882238.42679: done getting variables 11728 1726882238.42744: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 21:30:38 -0400 (0:00:00.387) 0:01:03.280 ****** 11728 1726882238.42776: entering _queue_task() for managed_node3/shell 11728 1726882238.43109: worker is 1 (out of 1 available) 11728 1726882238.43121: exiting _queue_task() for managed_node3/shell 11728 1726882238.43133: done queuing things up, now waiting for results queue to drain 11728 1726882238.43135: waiting for pending results... 
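Each _low_level_execute_command() in this run reuses one multiplexed SSH connection: the stderr chunks repeatedly show "auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41'" and "mux_client_request_session: master session id: 2", so only the first connection to 10.31.10.229 pays the full handshake and authentication cost. Ansible's ssh connection plugin enables this by passing OpenSSH ControlMaster/ControlPersist options by default. The snippet below is purely illustrative (the values are assumptions, not taken from this run's inventory) and shows how the same behaviour could be made explicit for a host or group:

# Illustrative only; values are assumptions and not read from this run's inventory.
# The ssh connection plugin already defaults to ControlMaster/ControlPersist, which
# is what produces the "auto-mux: Trying existing master" lines in the log above.
ansible_ssh_common_args: >-
  -o ControlMaster=auto
  -o ControlPersist=60s
  -o ControlPath=~/.ansible/cp/%C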
11728 1726882238.43518: running TaskExecutor() for managed_node3/TASK: Check routes and DNS 11728 1726882238.43572: in run() - task 12673a56-9f93-5c28-a762-000000000e5a 11728 1726882238.43600: variable 'ansible_search_path' from source: unknown 11728 1726882238.43614: variable 'ansible_search_path' from source: unknown 11728 1726882238.43656: calling self._execute() 11728 1726882238.43768: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882238.43779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882238.43798: variable 'omit' from source: magic vars 11728 1726882238.44205: variable 'ansible_distribution_major_version' from source: facts 11728 1726882238.44223: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882238.44264: variable 'omit' from source: magic vars 11728 1726882238.44300: variable 'omit' from source: magic vars 11728 1726882238.44340: variable 'omit' from source: magic vars 11728 1726882238.44390: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882238.44482: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882238.44486: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882238.44488: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882238.44502: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882238.44537: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882238.44547: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882238.44555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882238.44664: Set connection var ansible_connection to ssh 11728 1726882238.44680: Set connection var ansible_shell_executable to /bin/sh 11728 1726882238.44690: Set connection var ansible_timeout to 10 11728 1726882238.44810: Set connection var ansible_shell_type to sh 11728 1726882238.44813: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882238.44816: Set connection var ansible_pipelining to False 11728 1726882238.44818: variable 'ansible_shell_executable' from source: unknown 11728 1726882238.44820: variable 'ansible_connection' from source: unknown 11728 1726882238.44823: variable 'ansible_module_compression' from source: unknown 11728 1726882238.44825: variable 'ansible_shell_type' from source: unknown 11728 1726882238.44826: variable 'ansible_shell_executable' from source: unknown 11728 1726882238.44828: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882238.44830: variable 'ansible_pipelining' from source: unknown 11728 1726882238.44832: variable 'ansible_timeout' from source: unknown 11728 1726882238.44834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882238.44935: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882238.44956: variable 'omit' from source: magic vars 11728 
1726882238.44966: starting attempt loop 11728 1726882238.44974: running the handler 11728 1726882238.44990: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882238.45016: _low_level_execute_command(): starting 11728 1726882238.45032: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882238.45764: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882238.45784: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882238.45905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882238.45932: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882238.46019: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882238.47690: stdout chunk (state=3): >>>/root <<< 11728 1726882238.47900: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882238.47904: stdout chunk (state=3): >>><<< 11728 1726882238.47906: stderr chunk (state=3): >>><<< 11728 1726882238.47909: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882238.47923: 
_low_level_execute_command(): starting 11728 1726882238.47931: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882238.479075-14779-60518906347635 `" && echo ansible-tmp-1726882238.479075-14779-60518906347635="` echo /root/.ansible/tmp/ansible-tmp-1726882238.479075-14779-60518906347635 `" ) && sleep 0' 11728 1726882238.48886: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882238.48900: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882238.48909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882238.49101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882238.49116: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882238.49119: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882238.49121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882238.49123: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882238.49131: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882238.49135: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882238.49137: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882238.49139: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882238.49165: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882238.51189: stdout chunk (state=3): >>>ansible-tmp-1726882238.479075-14779-60518906347635=/root/.ansible/tmp/ansible-tmp-1726882238.479075-14779-60518906347635 <<< 11728 1726882238.51400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882238.51404: stdout chunk (state=3): >>><<< 11728 1726882238.51407: stderr chunk (state=3): >>><<< 11728 1726882238.51410: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882238.479075-14779-60518906347635=/root/.ansible/tmp/ansible-tmp-1726882238.479075-14779-60518906347635 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882238.51412: variable 'ansible_module_compression' from source: unknown 11728 1726882238.51414: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882238.51438: variable 'ansible_facts' from source: unknown 11728 1726882238.51518: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882238.479075-14779-60518906347635/AnsiballZ_command.py 11728 1726882238.51719: Sending initial data 11728 1726882238.51722: Sent initial data (154 bytes) 11728 1726882238.52349: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882238.52359: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882238.52370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882238.52390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882238.52408: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882238.52512: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882238.52522: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882238.52607: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882238.54163: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882238.54211: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11728 1726882238.54276: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmp5b1kt_ei /root/.ansible/tmp/ansible-tmp-1726882238.479075-14779-60518906347635/AnsiballZ_command.py <<< 11728 1726882238.54280: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882238.479075-14779-60518906347635/AnsiballZ_command.py" <<< 11728 1726882238.54316: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmp5b1kt_ei" to remote "/root/.ansible/tmp/ansible-tmp-1726882238.479075-14779-60518906347635/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882238.479075-14779-60518906347635/AnsiballZ_command.py" <<< 11728 1726882238.55290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882238.55297: stdout chunk (state=3): >>><<< 11728 1726882238.55300: stderr chunk (state=3): >>><<< 11728 1726882238.55302: done transferring module to remote 11728 1726882238.55304: _low_level_execute_command(): starting 11728 1726882238.55307: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882238.479075-14779-60518906347635/ /root/.ansible/tmp/ansible-tmp-1726882238.479075-14779-60518906347635/AnsiballZ_command.py && sleep 0' 11728 1726882238.55906: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882238.55915: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882238.55933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882238.55940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882238.55953: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882238.55960: stderr chunk (state=3): >>>debug2: match not found <<< 11728 1726882238.55969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882238.55983: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11728 1726882238.55990: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882238.55999: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11728 1726882238.56008: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882238.56017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882238.56038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882238.56044: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 <<< 11728 1726882238.56047: stderr chunk (state=3): >>>debug2: match found <<< 11728 1726882238.56150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882238.56154: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882238.56157: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882238.56159: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882238.56223: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882238.57987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882238.58055: stderr chunk (state=3): >>><<< 11728 1726882238.58068: stdout chunk (state=3): >>><<< 11728 1726882238.58092: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882238.58111: _low_level_execute_command(): starting 11728 1726882238.58192: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882238.479075-14779-60518906347635/AnsiballZ_command.py && sleep 0' 11728 1726882238.58778: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11728 1726882238.58792: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882238.58813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882238.58832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882238.58913: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882238.58959: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882238.58975: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882238.58999: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882238.59087: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882238.74790: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue 
state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:87:27:91:87:37 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.10.229/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3089sec preferred_lft 3089sec\n inet6 fe80::1087:27ff:fe91:8737/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.10.229 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.10.229 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:30:38.737251", "end": "2024-09-20 21:30:38.746030", "delta": "0:00:00.008779", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882238.76173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 11728 1726882238.76206: stderr chunk (state=3): >>><<< 11728 1726882238.76210: stdout chunk (state=3): >>><<< 11728 1726882238.76225: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:87:27:91:87:37 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.10.229/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3089sec preferred_lft 3089sec\n inet6 fe80::1087:27ff:fe91:8737/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.10.229 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.10.229 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:30:38.737251", "end": "2024-09-20 21:30:38.746030", "delta": "0:00:00.008779", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
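Annotation: for readability, the shell command from the "Check routes and DNS" module invocation above, un-escaped from the JSON "_raw_params" string. Content is identical to the log; only the indentation is restored. It dumps the interface addresses, IPv4/IPv6 routing tables, and resolver configuration that appear in the STDOUT block below.

    set -euo pipefail
    echo IP
    ip a
    echo IP ROUTE
    ip route
    echo IP -6 ROUTE
    ip -6 route
    echo RESOLV
    if [ -f /etc/resolv.conf ]; then
        cat /etc/resolv.conf
    else
        echo NO /etc/resolv.conf
        ls -alrtF /etc/resolv.* || :
    fi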
11728 1726882238.76263: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882238.479075-14779-60518906347635/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882238.76269: _low_level_execute_command(): starting 11728 1726882238.76274: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882238.479075-14779-60518906347635/ > /dev/null 2>&1 && sleep 0' 11728 1726882238.76758: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882238.76761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882238.76763: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882238.76766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 11728 1726882238.76768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882238.76817: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882238.76824: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882238.76826: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882238.76874: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882238.78639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882238.78665: stderr chunk (state=3): >>><<< 11728 1726882238.78668: stdout chunk (state=3): >>><<< 11728 1726882238.78682: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882238.78692: handler run complete 11728 1726882238.78714: Evaluated conditional (False): False 11728 1726882238.78722: attempt loop complete, returning result 11728 1726882238.78725: _execute() done 11728 1726882238.78727: dumping result to json 11728 1726882238.78733: done dumping result, returning 11728 1726882238.78741: done running TaskExecutor() for managed_node3/TASK: Check routes and DNS [12673a56-9f93-5c28-a762-000000000e5a] 11728 1726882238.78746: sending task result for task 12673a56-9f93-5c28-a762-000000000e5a 11728 1726882238.78846: done sending task result for task 12673a56-9f93-5c28-a762-000000000e5a 11728 1726882238.78849: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008779", "end": "2024-09-20 21:30:38.746030", "rc": 0, "start": "2024-09-20 21:30:38.737251" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:87:27:91:87:37 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.10.229/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 3089sec preferred_lft 3089sec inet6 fe80::1087:27ff:fe91:8737/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.10.229 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.10.229 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 11728 1726882238.78919: no more pending results, returning what we have 11728 1726882238.78923: results queue empty 11728 1726882238.78924: checking for any_errors_fatal 11728 1726882238.78935: done checking for any_errors_fatal 11728 1726882238.78936: checking for max_fail_percentage 11728 1726882238.78938: done checking for max_fail_percentage 11728 1726882238.78939: checking to see if all hosts have failed and the running result is not ok 11728 1726882238.78940: done checking to see if all hosts have failed 11728 1726882238.78940: getting the remaining hosts for this loop 11728 1726882238.78942: done getting the remaining hosts for this loop 11728 1726882238.78945: 
getting the next task for host managed_node3 11728 1726882238.78952: done getting next task for host managed_node3 11728 1726882238.78954: ^ task is: TASK: Verify DNS and network connectivity 11728 1726882238.78964: ^ state is: HOST STATE: block=5, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=4, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882238.78972: getting variables 11728 1726882238.78974: in VariableManager get_vars() 11728 1726882238.79022: Calling all_inventory to load vars for managed_node3 11728 1726882238.79025: Calling groups_inventory to load vars for managed_node3 11728 1726882238.79027: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882238.79037: Calling all_plugins_play to load vars for managed_node3 11728 1726882238.79040: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882238.79042: Calling groups_plugins_play to load vars for managed_node3 11728 1726882238.79986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882238.81538: done with get_vars() 11728 1726882238.81566: done getting variables 11728 1726882238.81636: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 21:30:38 -0400 (0:00:00.388) 0:01:03.669 ****** 11728 1726882238.81670: entering _queue_task() for managed_node3/shell 11728 1726882238.82034: worker is 1 (out of 1 available) 11728 1726882238.82047: exiting _queue_task() for managed_node3/shell 11728 1726882238.82061: done queuing things up, now waiting for results queue to drain 11728 1726882238.82063: waiting for pending results... 
11728 1726882238.82516: running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity 11728 1726882238.82527: in run() - task 12673a56-9f93-5c28-a762-000000000e5b 11728 1726882238.82550: variable 'ansible_search_path' from source: unknown 11728 1726882238.82560: variable 'ansible_search_path' from source: unknown 11728 1726882238.82612: calling self._execute() 11728 1726882238.82744: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882238.82831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882238.82835: variable 'omit' from source: magic vars 11728 1726882238.83169: variable 'ansible_distribution_major_version' from source: facts 11728 1726882238.83188: Evaluated conditional (ansible_distribution_major_version != '6'): True 11728 1726882238.83342: variable 'ansible_facts' from source: unknown 11728 1726882238.84432: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 11728 1726882238.84446: variable 'omit' from source: magic vars 11728 1726882238.84509: variable 'omit' from source: magic vars 11728 1726882238.84550: variable 'omit' from source: magic vars 11728 1726882238.84583: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11728 1726882238.84619: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11728 1726882238.84632: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11728 1726882238.84646: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882238.84657: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11728 1726882238.84681: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11728 1726882238.84684: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882238.84687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882238.84760: Set connection var ansible_connection to ssh 11728 1726882238.84769: Set connection var ansible_shell_executable to /bin/sh 11728 1726882238.84774: Set connection var ansible_timeout to 10 11728 1726882238.84776: Set connection var ansible_shell_type to sh 11728 1726882238.84784: Set connection var ansible_module_compression to ZIP_DEFLATED 11728 1726882238.84788: Set connection var ansible_pipelining to False 11728 1726882238.84811: variable 'ansible_shell_executable' from source: unknown 11728 1726882238.84814: variable 'ansible_connection' from source: unknown 11728 1726882238.84816: variable 'ansible_module_compression' from source: unknown 11728 1726882238.84819: variable 'ansible_shell_type' from source: unknown 11728 1726882238.84822: variable 'ansible_shell_executable' from source: unknown 11728 1726882238.84824: variable 'ansible_host' from source: host vars for 'managed_node3' 11728 1726882238.84828: variable 'ansible_pipelining' from source: unknown 11728 1726882238.84832: variable 'ansible_timeout' from source: unknown 11728 1726882238.84834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11728 1726882238.84940: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882238.84950: variable 'omit' from source: magic vars 11728 1726882238.84955: starting attempt loop 11728 1726882238.84958: running the handler 11728 1726882238.84967: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11728 1726882238.84985: _low_level_execute_command(): starting 11728 1726882238.84989: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11728 1726882238.85465: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882238.85505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882238.85508: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882238.85511: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882238.85514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882238.85561: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882238.85564: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11728 1726882238.85566: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882238.85625: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882238.87272: stdout chunk (state=3): >>>/root <<< 11728 1726882238.87366: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882238.87396: stderr chunk (state=3): >>><<< 11728 1726882238.87400: stdout chunk (state=3): >>><<< 11728 1726882238.87419: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882238.87434: _low_level_execute_command(): starting 11728 1726882238.87439: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882238.8741865-14800-47197012253669 `" && echo ansible-tmp-1726882238.8741865-14800-47197012253669="` echo /root/.ansible/tmp/ansible-tmp-1726882238.8741865-14800-47197012253669 `" ) && sleep 0' 11728 1726882238.87881: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882238.87886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882238.87888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882238.87891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882238.87941: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882238.87945: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882238.87996: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882238.89847: stdout chunk (state=3): >>>ansible-tmp-1726882238.8741865-14800-47197012253669=/root/.ansible/tmp/ansible-tmp-1726882238.8741865-14800-47197012253669 <<< 11728 1726882238.89951: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882238.89976: stderr chunk (state=3): >>><<< 11728 1726882238.89979: stdout chunk (state=3): >>><<< 11728 1726882238.89998: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882238.8741865-14800-47197012253669=/root/.ansible/tmp/ansible-tmp-1726882238.8741865-14800-47197012253669 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882238.90023: variable 'ansible_module_compression' from source: unknown 11728 1726882238.90067: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-117283mmjgip2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11728 1726882238.90097: variable 'ansible_facts' from source: unknown 11728 1726882238.90151: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882238.8741865-14800-47197012253669/AnsiballZ_command.py 11728 1726882238.90249: Sending initial data 11728 1726882238.90252: Sent initial data (155 bytes) 11728 1726882238.90664: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882238.90698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11728 1726882238.90703: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found <<< 11728 1726882238.90706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882238.90710: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882238.90713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882238.90765: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882238.90768: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882238.90823: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882238.92331: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11728 1726882238.92334: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" 
debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11728 1726882238.92373: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11728 1726882238.92421: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-117283mmjgip2/tmpso1cbkhm /root/.ansible/tmp/ansible-tmp-1726882238.8741865-14800-47197012253669/AnsiballZ_command.py <<< 11728 1726882238.92428: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882238.8741865-14800-47197012253669/AnsiballZ_command.py" <<< 11728 1726882238.92465: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-117283mmjgip2/tmpso1cbkhm" to remote "/root/.ansible/tmp/ansible-tmp-1726882238.8741865-14800-47197012253669/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882238.8741865-14800-47197012253669/AnsiballZ_command.py" <<< 11728 1726882238.93012: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882238.93048: stderr chunk (state=3): >>><<< 11728 1726882238.93051: stdout chunk (state=3): >>><<< 11728 1726882238.93092: done transferring module to remote 11728 1726882238.93104: _low_level_execute_command(): starting 11728 1726882238.93108: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882238.8741865-14800-47197012253669/ /root/.ansible/tmp/ansible-tmp-1726882238.8741865-14800-47197012253669/AnsiballZ_command.py && sleep 0' 11728 1726882238.93545: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11728 1726882238.93548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882238.93554: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address <<< 11728 1726882238.93556: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found <<< 11728 1726882238.93558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882238.93598: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882238.93602: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882238.93656: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882238.95350: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882238.95376: stderr chunk (state=3): >>><<< 11728 1726882238.95379: stdout chunk (state=3): >>><<< 11728 1726882238.95391: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882238.95396: _low_level_execute_command(): starting 11728 1726882238.95403: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882238.8741865-14800-47197012253669/AnsiballZ_command.py && sleep 0' 11728 1726882238.95820: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882238.95824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882238.95826: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882238.95828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882238.95877: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882238.95880: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882238.95935: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882239.19445: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org 
mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 6754 0 --:--:-- --:--:-- --:--:-- 6777\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 15286 0 --:--:-- --:--:-- --:--:-- 16166", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:30:39.107128", "end": "2024-09-20 21:30:39.192631", "delta": "0:00:00.085503", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11728 1726882239.21113: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 
<<< 11728 1726882239.21117: stdout chunk (state=3): >>><<< 11728 1726882239.21119: stderr chunk (state=3): >>><<< 11728 1726882239.21207: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 6754 0 --:--:-- --:--:-- --:--:-- 6777\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 15286 0 --:--:-- --:--:-- --:--:-- 16166", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:30:39.107128", "end": "2024-09-20 21:30:39.192631", "delta": "0:00:00.085503", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.229 closed. 11728 1726882239.21218: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882238.8741865-14800-47197012253669/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11728 1726882239.21222: _low_level_execute_command(): starting 11728 1726882239.21225: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882238.8741865-14800-47197012253669/ > /dev/null 2>&1 && sleep 0' 11728 1726882239.21797: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11728 1726882239.21813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882239.21828: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11728 1726882239.21875: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' <<< 11728 1726882239.21899: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11728 1726882239.21936: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11728 1726882239.23785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11728 1726882239.23791: stdout chunk (state=3): >>><<< 11728 1726882239.23795: stderr chunk (state=3): >>><<< 11728 1726882239.23999: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.229 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.10.229 originally 10.31.10.229 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/537759ca41' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11728 1726882239.24003: handler run complete 11728 1726882239.24005: Evaluated conditional (False): False 11728 1726882239.24007: attempt loop complete, returning result 11728 1726882239.24009: _execute() done 11728 1726882239.24011: dumping result to json 11728 1726882239.24013: done dumping result, returning 11728 1726882239.24014: done running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity [12673a56-9f93-5c28-a762-000000000e5b] 11728 1726882239.24016: sending task result for task 12673a56-9f93-5c28-a762-000000000e5b 11728 1726882239.24092: done sending task result for task 12673a56-9f93-5c28-a762-000000000e5b ok: [managed_node3] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.085503", "end": "2024-09-20 21:30:39.192631", "rc": 0, "start": "2024-09-20 21:30:39.107128" } STDOUT: CHECK DNS AND CONNECTIVITY 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 6754 0 --:--:-- --:--:-- --:--:-- 6777 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 15286 0 --:--:-- --:--:-- --:--:-- 16166 11728 1726882239.24177: no more pending results, returning what we have 11728 1726882239.24180: results queue empty 11728 1726882239.24181: checking for any_errors_fatal 11728 1726882239.24190: done checking for any_errors_fatal 11728 1726882239.24191: checking for max_fail_percentage 11728 1726882239.24194: done checking for max_fail_percentage 11728 
1726882239.24196: checking to see if all hosts have failed and the running result is not ok 11728 1726882239.24197: done checking to see if all hosts have failed 11728 1726882239.24198: getting the remaining hosts for this loop 11728 1726882239.24201: done getting the remaining hosts for this loop 11728 1726882239.24233: getting the next task for host managed_node3 11728 1726882239.24243: done getting next task for host managed_node3 11728 1726882239.24246: ^ task is: TASK: meta (flush_handlers) 11728 1726882239.24251: WORKER PROCESS EXITING 11728 1726882239.24256: ^ state is: HOST STATE: block=6, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11728 1726882239.24263: getting variables 11728 1726882239.24264: in VariableManager get_vars() 11728 1726882239.24338: Calling all_inventory to load vars for managed_node3 11728 1726882239.24341: Calling groups_inventory to load vars for managed_node3 11728 1726882239.24344: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882239.24354: Calling all_plugins_play to load vars for managed_node3 11728 1726882239.24356: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882239.24359: Calling groups_plugins_play to load vars for managed_node3 11728 1726882239.29504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882239.30998: done with get_vars() 11728 1726882239.31027: done getting variables 11728 1726882239.31068: in VariableManager get_vars() 11728 1726882239.31084: Calling all_inventory to load vars for managed_node3 11728 1726882239.31086: Calling groups_inventory to load vars for managed_node3 11728 1726882239.31087: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882239.31090: Calling all_plugins_play to load vars for managed_node3 11728 1726882239.31092: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882239.31097: Calling groups_plugins_play to load vars for managed_node3 11728 1726882239.31721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882239.32556: done with get_vars() 11728 1726882239.32574: done queuing things up, now waiting for results queue to drain 11728 1726882239.32575: results queue empty 11728 1726882239.32576: checking for any_errors_fatal 11728 1726882239.32579: done checking for any_errors_fatal 11728 1726882239.32579: checking for max_fail_percentage 11728 1726882239.32580: done checking for max_fail_percentage 11728 1726882239.32580: checking to see if all hosts have failed and the running result is not ok 11728 1726882239.32581: done checking to see if all hosts have failed 11728 1726882239.32581: getting the remaining hosts for this loop 11728 1726882239.32582: done getting the remaining hosts for this loop 11728 1726882239.32584: getting the next task for host managed_node3 11728 1726882239.32587: done getting next task for host managed_node3 11728 1726882239.32589: ^ task is: TASK: meta (flush_handlers) 11728 1726882239.32590: ^ state is: HOST STATE: block=7, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 11728 1726882239.32592: getting variables 11728 1726882239.32592: in VariableManager get_vars() 11728 1726882239.32606: Calling all_inventory to load vars for managed_node3 11728 1726882239.32608: Calling groups_inventory to load vars for managed_node3 11728 1726882239.32609: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882239.32614: Calling all_plugins_play to load vars for managed_node3 11728 1726882239.32615: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882239.32617: Calling groups_plugins_play to load vars for managed_node3 11728 1726882239.33718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882239.34558: done with get_vars() 11728 1726882239.34571: done getting variables 11728 1726882239.34608: in VariableManager get_vars() 11728 1726882239.34618: Calling all_inventory to load vars for managed_node3 11728 1726882239.34620: Calling groups_inventory to load vars for managed_node3 11728 1726882239.34621: Calling all_plugins_inventory to load vars for managed_node3 11728 1726882239.34624: Calling all_plugins_play to load vars for managed_node3 11728 1726882239.34626: Calling groups_plugins_inventory to load vars for managed_node3 11728 1726882239.34628: Calling groups_plugins_play to load vars for managed_node3 11728 1726882239.35237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11728 1726882239.36159: done with get_vars() 11728 1726882239.36175: done queuing things up, now waiting for results queue to drain 11728 1726882239.36176: results queue empty 11728 1726882239.36177: checking for any_errors_fatal 11728 1726882239.36178: done checking for any_errors_fatal 11728 1726882239.36178: checking for max_fail_percentage 11728 1726882239.36179: done checking for max_fail_percentage 11728 1726882239.36179: checking to see if all hosts have failed and the running result is not ok 11728 1726882239.36180: done checking to see if all hosts have failed 11728 1726882239.36180: getting the remaining hosts for this loop 11728 1726882239.36181: done getting the remaining hosts for this loop 11728 1726882239.36183: getting the next task for host managed_node3 11728 1726882239.36185: done getting next task for host managed_node3 11728 1726882239.36186: ^ task is: None 11728 1726882239.36187: ^ state is: HOST STATE: block=8, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
11728 1726882239.36187: done queuing things up, now waiting for results queue to drain
11728 1726882239.36188: results queue empty
11728 1726882239.36188: checking for any_errors_fatal
11728 1726882239.36189: done checking for any_errors_fatal
11728 1726882239.36189: checking for max_fail_percentage
11728 1726882239.36190: done checking for max_fail_percentage
11728 1726882239.36190: checking to see if all hosts have failed and the running result is not ok
11728 1726882239.36191: done checking to see if all hosts have failed
11728 1726882239.36192: getting the next task for host managed_node3
11728 1726882239.36197: done getting next task for host managed_node3
11728 1726882239.36198: ^ task is: None
11728 1726882239.36198: ^ state is: HOST STATE: block=8, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node3              : ok=148  changed=4    unreachable=0    failed=0    skipped=97   rescued=0    ignored=0

Friday 20 September 2024  21:30:39 -0400 (0:00:00.545)       0:01:04.214 ******
===============================================================================
** TEST check bond settings --------------------------------------------- 6.90s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:3
Gathering Facts --------------------------------------------------------- 1.92s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_options_nm.yml:6
fedora.linux_system_roles.network : Check which services are running ---- 1.92s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
** TEST check bond settings --------------------------------------------- 1.87s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_bond_options.yml:3
fedora.linux_system_roles.network : Check which services are running ---- 1.84s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.83s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.79s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Create test interfaces -------------------------------------------------- 1.71s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
Create test interfaces -------------------------------------------------- 1.54s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
fedora.linux_system_roles.network : Check which packages are installed --- 1.03s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.98s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 0.98s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 0.97s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_options.yml:3
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.93s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.92s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.90s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.89s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gather the minimum subset of ansible_facts required by the network role test --- 0.86s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Install dnsmasq --------------------------------------------------------- 0.86s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 0.83s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
11728 1726882239.36276: RUNNING CLEANUP
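For anyone wanting to rerun the "Verify DNS and network connectivity" check outside of this log, the sketch below reassembles the shell payload recorded in the _raw_params of the ansible.legacy.command invocation above into a standalone Ansible task. It is a hypothetical reconstruction, not the task file used in this run: the script body and task name are taken from the log, while the module spelling (ansible.builtin.shell, consistent with _uses_shell: true) and the changed_when line (inferred from the raw module output reporting changed=true but the final result printing changed=false) are assumptions.

    # Hypothetical reconstruction -- illustration only, not the original test playbook.
    - name: Verify DNS and network connectivity
      ansible.builtin.shell: |       # assumption: shell module, consistent with _uses_shell: true
        set -euo pipefail
        echo CHECK DNS AND CONNECTIVITY
        for host in mirrors.fedoraproject.org mirrors.centos.org; do
          if ! getent hosts "$host"; then
            echo FAILED to lookup host "$host"
            exit 1
          fi
          if ! curl -o /dev/null https://"$host"; then
            echo FAILED to contact host "$host"
            exit 1
          fi
        done
      changed_when: false            # assumption: would explain why the result is reported as changed=false

Functionally, each mirror host must both resolve through getent and answer an HTTPS request via curl; either failure prints a FAILED line and exits non-zero, which would fail the task. In this run both checks passed (rc=0), matching the ok status in the PLAY RECAP above.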